https://queue.acm.org/detail.cfm?id=1983083
The main purpose of the stack is to facilitate garbage collection. The GC can find out what Lua objects are currently being manipulated by the C client. The price is that the Lua API can never expose a "Lua object pointer" to C. Many other scripting languages expose such pointers, but then must also expose ways to manage their memory. For example, in Python the user of the API must explicitly increment and decrement reference counts.
> Traditionally, most virtual machines intended for actual execution are stack based, a trend that started with Pascal’s P-machine and continues today with Java’s JVM and Microsoft’s .Net environment. Currently, however, there has been a growing interest in register-based virtual machines (for instance, the planned new virtual machine for Perl 6 (Parrot) will be register based). As far as we know, the virtual machine of Lua 5.0 is the first register-based virtual machine to have a wide use.
The API directly reflected the (previous) internals of the VM, I guess [1].
[1] (pdf) https://www.lua.org/doc/jucs05.pdf
> Why?
I suspect it's because implementing control mechanisms is much easier when you reify the language's stack. Especially advanced ones such as generators and continuations which require copying the stack into an object and restoring it later. Making that work with the native stack is really hard and many languages don't even try.
It also makes garbage collection precise. Lua values in C variables would be placed in registers or the native stack. In order to trace those values, Lua would require a conservative garbage collector that spills the registers and scans the entire native stack. By managing their own stack, they can avoid doing all that.
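Lua's coroutines are a concrete payoff of that reified stack: a whole Lua stack can be suspended and resumed as a first-class value. From the scripting side it looks roughly like this (a tiny generator sketch in plain standard Lua, nothing project-specific):

    -- a generator built from a coroutine; each yield suspends the Lua stack
    local function range(n)
      return coroutine.wrap(function()
        for i = 1, n do coroutine.yield(i) end
      end)
    end

    for i in range(3) do print(i) end  --> 1  2  3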
It might make sense for building complex plugins, but for simple stuff, Lua still excels, and doesn't require much programming knowledge in many cases.
That didn't exist until recently, but now you can use https://github.com/bytecodealliance/wasm-micro-runtime/blob/...
I am not aware of anyone using this yet, but I hope to see that become common in the next few years.
Extism is a plugin framework for WebAssembly:
https://github.com/extism/extism
Visual Studio Code can run WebAssembly extensions: https://code.visualstudio.com/blogs/2024/05/08/wasm
A project to bring WebAssembly plugins to Godot: https://github.com/ashtonmeuser/godot-wasm
WasmEdge can be embedded in applications: https://wasmedge.org/docs/embed/overview
Wasmer can be embedded in applications: https://blog.wasmer.io/executing-webassembly-in-your-rust-ap...
Wasmtime can be embedded in applications: https://docs.wasmtime.dev/lang.html
> Terra is a low-level system programming language that is embedded in and meta-programmed by the Lua programming language.. Like C/C++, Terra is a statically-typed, compiled language with manual memory management. But unlike C/C++, it is designed from the beginning to be meta-programmed from Lua.. In Terra, we just gave in to the trend of making the meta-language of C/C++ more powerful and replaced it with a real programming language, Lua.. Terra programs use the same LLVM backend that Apple uses for its C compilers. This means that Terra code performs similarly to equivalent C code.
From Pat Hanrahan's research group at Stanford, https://amturing.acm.org/award_winners/hanrahan_4652251.cfm
Hanrahan and his students developed Brook, a language for GPUs that eventually led to NVIDIA’s CUDA. The prevalence and variety of shading languages ultimately required the GPU hardware designers to develop more flexible architectures. These architectures, in turn, allowed the GPUs to be used in a variety of computing contexts, including running algorithms for high performance computing applications, and training machine learning algorithms on massive datasets for artificial intelligence applications.
Papers: https://terralang.org/publications.html
> Please note: The entire "windows" portion of that article has largely been resolved at this point, because I fixed it myself.
That being said, I don't think either of these languages is practical for real world projects for several reasons: lack of (or low quality) documentation, bus factor (lack of developers), tiny communities, lack of tools, lack of libraries, etc etc etc.
--
Maybe not yet, but prior work by this research team eventually led to CUDA. Terra may be useful as a productivity DSL for some high-performance computations, e.g. listed in their papers:
our DSL for stencil computations runs 2.3x faster than hand-written C
our serialization library is 11 times faster than Kryo
our dynamic assembler is 3–20 times faster than Google’s Chrome assembler
2021 HN thread, https://news.ycombinator.com/item?id=27334065
> Terra is a workhorse and it gets the job done.. Having first-class code generation capabilities has been really nice, especially with built-in support for things like parsing C header files, vectorization, and CUDA code generation. The language is stable and generally doesn't have many surprises. Unlike, say, C++ or Rust, there aren't as many corners to hide odd behaviors.
Can it use anything else (as an option), e.g. Lua? That would be useful during development/debugging thanks to faster iteration and memory safety.
> 1.2k developers have produced a 6.9M line code base with an estimated price tag of $530M.
The problem is that LLVM is very slow (as it applies lots of optimisations) and heavy, and that makes a compiler slow even in debug mode.
D has 3 backends; LLVM is used by the LDC compiler. Then there's DMD, which is the official compiler (and the frontend for all three), and GDC, which uses the GCC backend. D code typically compiles quite a bit faster with DMD, but the code is more highly optimised with LLVM or GCC, of course. It's a great tradeoff, especially since all compilers use the DMD frontend, so it's almost guaranteed that there are no differences in behaviour between them.
> Terra programs use the same LLVM backend that Apple uses for its C compilers
lol is this a flex? or is it just dated? everyone uses LLVM for all of their compilers.
Bump CI to macOS 13
Add AVX512 support
Update LuaJIT
Document AMD, Intel, Nvidia GPU support
Add support for LLVM18
Support SPIR-V code generation
It's also worth noting that the interface is clean in such a way that it is straightforward to automatically generate bindings. In my case, I used a handful of Roslyn Incremental Source Generators to automatically generate bindings between C# and Lua that matched my overall architecture. It was not at all difficult because of the way the interface is designed. The Lua stack together with its dynamic typing and "tables" made it very easy to generate marshallers for arbitrary data classes between C# and Lua.
That said, there are plenty of valid criticisms of the language itself (sorry, not to nitpick, but I am really not a fan of one-based indexing). I'm thinking about designing an embedded scripting language that addresses some of these issues while having a similar interface for integration... Would make for a fun side project one of these days.
It is very funny how this is the one sole criticism that always gets brought up. Not that other problems don't exist, but they're not talked about much.
Lua's strength as a language is that it does a lot quite well in ways that aren't noticeable. But when you compare it to the competition, those strengths become quite obvious.
E.g. Type Coercion. Complete shitshow in lots of languages. Nightmare in Javascript. In Lua? It's quite elegant but, most interestingly, effortlessly elegant. Very little of the rest of the language had to change to accommodate the improvement. (Exercise for the reader: spot the one change that completely fixes string<->number conversion.)
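If I'm reading the exercise right, the change is that concatenation got its own operator, so `+` never has to guess. A quick illustration of standard Lua behaviour:

    print("10" + 5)   --> 15   (arithmetic coerces number-like strings)
    print("10" .. 5)  --> 105  (.. is concatenation, so + is never ambiguous)
    -- nonsense coercions fail loudly instead of guessing:
    -- print({} + 1)  --> error: attempt to perform arithmetic on a table value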
Makes Brendan Eich look like a total clown in comparison.
Eventually you end up in a place where it's beneficial to have converter functions that show up in the call stack frames so that you can keep track of whether the index is in the right "coordinate index system" (for lack of a better term) for the right language.
One can read through “mystdlib” part of any meaningful Lua-based project to see it. Things you’ll likely find there are: NIL, poor man’s classes, __newindex proxying wrapper, strict(), empty dict literal for json, “fixed” iteration protocols.
You don’t even have to walk too far, it’s all right there in neovim: https://neovim.io/doc/user/lua.html
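For anyone who hasn't seen one of those "mystdlib" files: strict() is usually a few lines of metatable work on _G, roughly along the lines of the strict.lua that has floated around the Lua world for years. A sketch (the `declare` helper name is made up):

    -- error on reads/writes of undeclared globals
    local declared = {}
    local function declare(name) declared[name] = true end

    setmetatable(_G, {
      __newindex = function(t, k, v)
        if not declared[k] then
          error("assignment to undeclared global '" .. tostring(k) .. "'", 2)
        end
        rawset(t, k, v)
      end,
      __index = function(_, k)
        if not declared[k] then
          error("read of undeclared global '" .. tostring(k) .. "'", 2)
        end
      end,
    })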
Why numbering should start at zero. -- Dijkstra
To me 1 based indexing is natural if you stop pretending that arrays are pointers + index arithmetics. Especially with slicing syntax.
It's one of the things that irked me when switching to Julia from Python but which became just obviously better after I made the switch.
E.g. in Julia `1:3` represents the numbers 1 to 3. `A[1]` is the first element of the array, `A[1:3]` is a slice containing the first to third element. `A[1:3]` and `A[4:end]` partitions the array. (As an aside: `For i in 1:3` gives the number 1, 2, 3.)
The same sentence in python:
`1:3` doesn't make sense on its own. `A[0]` is the first element of the array. `A[0:3]` gives the elements `A[0], A[1]` and `A[2]`. `A[0:3]` and `A[3:]` slice the array.
For Python, which follows Dijkstra for its Slice delimiters, I need to draw a picture for beginners (I feel like the numpy documentation used to have this picture, but not anymore). The Julia option requires you to sometimes type an extra +1 but it's a meaningful +1 ("start at the next element") and even beginners never get this wrong.
That said, it seems to me that for Lua, with the focus on embedding in the C world, 0 index makes more sense.
"when starting with subscript 1, the subscript range 1 ≤ i < N+1; starting with 0, however, gives the nicer range 0 ≤ i < N"
So it's "nicer", ok! Lua has a numeric for..loop, which doesn't require this kind of range syntax. Looping is x,y,step where x and y are inclusive in the range, i.e. Dijkstra's option (b). Dijkstra doesn't like this because iterating the empty set is awkward. But it's far more natural (if you aren't already used to languages from the 0-indexed lineage) to simply specify the lower and upper bounds of your search.
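Concretely, the numeric for just takes inclusive bounds and an optional step (standard Lua):

    local t = { "a", "b", "c" }
    for i = 1, #t do print(i, t[i]) end      -- 1 a, 2 b, 3 c
    for i = #t, 1, -1 do print(i, t[i]) end  -- reverse: 3 c, 2 b, 1 a
    for i = 1, 0 do print("never runs") end  -- the awkward empty case Dijkstra disliked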
I actually work a lot with Lua, all the time, alongside other 0-indexed languages such as C and JS. I believe 0 makes sense in C, where arrays are pointers and the subscript is actually an offset. That still doesn't make the 1st item the 0th item.
Between this, and the fact that, regardless of language, I find myself having to add or subtract 1 frequently in different scenarios, I think it's less of a deal than people make it out to be.
If you then even consider the simple case of nested arrays, I think it becomes really difficult to defend 1-based indexing as being cognitively easier to manipulate, as the unit of "index" doesn't naturally map to a counting number like that... if you use 0-based indexes, all of the math is simple, whereas with 1-based you have to rebalance your 1s depending on "how many" indexes your compound unit now represents.
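To make the rebalancing concrete, here's the usual row-major flattening in both conventions (my own toy example):

    -- 0-based: offset = row * width + col
    -- 1-based: offset = (row - 1) * width + col
    local function index0(row, col, width) return row * width + col end
    local function index1(row, col, width) return (row - 1) * width + col end

    print(index1(2, 3, 10))  --> 13, i.e. the 3rd column of the 2nd row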
If the word for word same argument was made by an anonymous blogger no one would even consider citing this as a definitive argument that ends the discussion.
The real gripes should be globals by default and ... nothing. Lua is wonderful.
I get you are taking the piss but ~= is just as logical as != being the symbols for: "not equals", if you've been exposed to some math(s) as well as ... well, ! means factorial, doesn't it?
Syntax is syntax and so is vocabulary! In the end you copy and paste off of Stack Exchange and all is golden 8)
[attr~=value]
Represents elements with an attribute name of attr whose value is a whitespace-separated list of words, one of which is exactly value.
[1] https://developer.mozilla.org/en-US/docs/Web/CSS/Attribute_s...
I'm a big fan of Lua, including for the reasons you mention. I suspect the reason this one thing is always brought up is twofold: it's easy to notice, and it's very rare these days outside of Lua (if you consider VB.NET to be a legacy language, anyway). Other criticisms take more effort to communicate, and you can throw a rock and hit ten other languages with the same or similar issues.
To be fair, Brendan Eich was making a scripting language for the 90's web. It isn't his fault Silicon Valley decided that language needed to become the Ur-language to replace all application development in the future.
IIRC, Eich was quite influenced by Python's design. I wish he'd just used Lua - would likely have saved a lot of pain. (Although, all that said, I have no idea what Lua looked like in 1994, and how much of its design has changed since then.)
If you don't know what Lua was like then, don't wish that I'd "just used Lua".
Other issues include Netscape target system support, "make it look like Java" orders from above, without which it wouldn't have happened, and more.
It sounds like you're saying Yoz got the sequence of events wrong, and that MILLJ was a necessary part of getting scripting in the browser? I sort of had the impression that the reason they hired you in the first place was that they wanted scripting in the browser, but I wasn't there.
I don't think Lua was designed to enforce a security boundary between the user and the programmer, which was a pretty unusual requirement, and very tricky to retrofit. However, contrary to what you say in that comment, I don't think Lua's target system support or evolutionary path would have been a problem. The Lua runtime wasn't (and isn't) OS-dependent, and it didn't evolve rapidly.
But finding that out would have taken time, and time was extremely precious right then. Also, Lua wasn't open-source yet. (See https://compilers.iecc.com/comparch/article/94-07-051.) And it didn't look like Java. So Lua had two fatal flaws, even apart from the opportunity cost of digging into it to see if it was suitable. Three if you count the security role thing.
He was, and he doesn't deserve the full blame for being bad at designing a language when that wasn't his prior job or field of specialization.
But Lua is older so there's this element of "it didn't need to be this bad, he just fucked up" (And Eich being a jerk makes it amusing to pour some salt on that wound. Everyone understands it's not entirely serious.)
Which remains one of the most baffling decisions of all time, even to this day. Javascript is unpleasant to work with in the browser, the place it was designed for. It is utterly beyond me why anyone would go out of their way to use it in contexts where there are countless better languages available for the job. At least in the browser you pretty much have to use JS, so there's a good reason to tolerate it. Not so outside of the browser.
The pressure to evolve JS in a more fair-play standards setting rose and fell as browser competition rose and fell, because browser vendors compete for developers as lead users and promoters. Before competition came back, a leading or upstart browser could and did innovate ahead of the last JS standard. IE did this with DHTML mostly outside the core language, which MS helped standardize at the same time. I did it in Mozilla's engine in the late '90s, implementing things that made it into ES3, ES5, and ES6 (Array extras, getters and setters, more).
But the evolutionary regime everyone operated in didn't "decide" anything. There was and is no "Silicon Valley" entity calling such shots.
Oh come on, you understand full well that they're referring to the wider SV business/software development "ecosystem".
Which is absolutely to blame for javascript becoming the default language for full-stack development, and the resulting JS-ecosystem being a dysfunctional shitshow.
Most of this new JS-ecosystem was built by venture capital startups & tech giants obsessed with deploying quickly, with near-total disregard for actually building something robustly functional and sustainable.
e.g. React as a framework does not make sense in the real world. It is simply too slow on the median device.
It does, however, make sense in the world of the Venture Capital startup. Where you don't need users to be able to actually use your app/website well. You only need that app/website to exist ASAP so you can collect the next round of investment.
;)
It's been a while since I wrote any Lua but I didn't think it was intuitive at all. Why are arrays tables, why are they nil terminated and why do they start at 1.
I don't think it's underrated either. It's very popular for writing scripts in (older) game engines.
Exactly, it seems popular in places where the alternative is even less usable (e.g. write your own C++ plugin or thing). For that it's fine.
For example, there is a lua plugin for x-plane and it's great. I've done some minor things with it even. But that didn't make me a fan of the language. It's fine for simple things. But a bit limited.
The x-plane API is pretty low level and written in C++. So something that sits on top and is a bit more accessible is great. There are a lot of lua scripts available that you can run in there that are pretty nice. But the more complicated ones get pretty hairy quickly. Makes you long for something more structured.
I don't have the time, but I did consider working on adding a websocket API at some point to make it easier to integrate with from e.g. a web page or some kind of standalone process. I think they are now planning to add that feature themselves so there's no need to work on this anymore. And they have a low level UDP API currently that exposes all sorts of things. But a websocket API with some good client libraries for different languages would make a lot of sense.
For the same reason why (standard) arrays are associative in PHP and JS, as well - because, from a certain perspective, a regular array is just an associative array with consecutive integers as keys, so if you're trying to keep things simple, it seems like a straightforward unification.
> why are they nil terminated
It follows from them being associative arrays, and Lua's pervasive use of "nil" meaning "this thing is missing".
> why do they start at 1
Because humans generally start counting from 1.
I'm not the parent, but for me the question is not literally why they chose 1-based indexing. That's easy to answer. The question is really: why haven't they realized by now that the disadvantages of 1-based indexing far outweigh the benefits when it comes to programming, and changed it? Backward compatibility is a valid reason, but lots of other people have tried 1-based arrays and reverted to 0-based once it became clear that 1-based arrays cause more problems than they solve. Even humans who count starting from one will soon want to use 0-based the moment they need to add two array indexes together; we don't want to have to start from 2 in that case, right? Early versions of the famous Numerical Recipes in C started with 1-based arrays, but removed them, and got slightly better. :P
I might be able to provide at least some insight into that.
One of the things I found while implementing my language was that the ordering of elements in a hash table is essentially random. Literally random if you mix in some randomness into the hash function's initial state in order to harden it against collision attacks.
Many languages guarantee insertion ordering. It's a useful property to have, makes testing easy and deterministic and it's intuitive for users. A simple way to implement that is to put a perfectly normal array of values into the hash table, and then have the buckets which the keys hash into contain indexes into that array. That way the indexes of values remain stable no matter what the keys hash to.
So a sufficiently advanced hash table will have an array of values inside it. I suppose some language designers took one look at that situation and decided to just merge both types already by using hash tables for everything. For example, PHP also did that. Associative arrays, they call it.
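A toy Lua sketch of that layout, i.e. a dense array of entries plus a hash from key to index, which is what buys you insertion order essentially for free (names are made up):

    local entries, index = {}, {}   -- dense array + key -> position

    local function set(key, value)
      local i = index[key]
      if i then
        entries[i].value = value            -- overwriting keeps the original position
      else
        entries[#entries + 1] = { key = key, value = value }
        index[key] = #entries
      end
    end

    local function get(key)
      local i = index[key]
      if i then return entries[i].value end
    end

    -- iteration walks the dense array, so it runs in insertion order
    set("b", 2); set("a", 1); set("b", 20)
    for _, e in ipairs(entries) do print(e.key, e.value) end  --> b 20, a 1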
"Associative array" is just the term used for hash tables (dicts) in Awk, bash, Tcl, and, most relevantly for PHP, Perl4. It doesn't specifically mean that the language lacks a separate container type for small-integer-indexed arrays. It just means that that particular container type can be indexed by something other than small integers, usually strings.
It doesn’t mean hash tables are the arrays that should be used to store sequential-integer-indexed values.
In case anyone is curious about it:
https://mail.python.org/pipermail/python-dev/2012-December/1...
> In addition to space savings, the new memory layout makes iteration faster.
> Currently, keys(), values(), and items() loop over the sparse table, skipping over free slots in the hash table.
> Now, keys/values/items can loop directly over the dense table, using fewer memory accesses.
The "dense table" is the array of values. The "sparse table" is the actual hash table, only the values index into the values array.
I thought the insertion ordering was implemented at around the same time as a result of it. The data structure ensures it pretty much for free, seemed like the obvious next step. It appears I was mistaken about that.
But I fail to see the logic behind seeing this and merging array/table ideas. Feels like one of these irrational decisions you make in a dream.
And Lua doesn’t use a hash table to store arrays. Its tables have two different parts (hash, indexed array) with foggy distribution logic, which shows up as the “# vs ipairs” issue. This merge is purely synthetic and doesn’t reflect any underlying implementation idea.
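To make the "# vs ipairs" point concrete (standard Lua; the length of a table with holes is explicitly undefined, any "border" is a legal answer):

    local t = { 1, 2, nil, 4 }
    print(#t)                                  -- often 4 here, but 2 would also be legal
    for i, v in ipairs(t) do print(i, v) end   -- stops at the hole: 1 1, 2 2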
Looking from today's perspective, I also fail to see the logic behind it: separating arrays and dictionaries does not make your implementation more complicated than having a single backing data structure for both and taking care of all the special cases that arise from such a monstrosity.
The best explanation I can think of is that this was just a fad nobody questioned too much. Back in the 1980s and 1990s, there was a strong distinction between "scripting languages" and general-purpose dynamic languages (like Lisp and Smalltalk). Scripting languages were supposed to be used to write small scripts to automate stuff. Type-safety was not something that any mainstream language considered back then (it was a thing in niche languages like Ada, SML and Eiffel), and speed was not a concern for scripting languages.
A common philosophy (though I've never seen it clearly stated) for scripting languages seems to have been to "be liberal and try to work with whatever the programmer does".
So you shouldn't require variables to be declared in advance (that's a lot of hassle!) or throw an error if a function is called with the wrong number of arguments.
And what if the programmer tries to add an integer and a number written as string? Just coerce them. Or better yet, make everything a string like Tcl does. Or at the very least, don't be a spoilsport and have separate types for integers and floating point numbers!
Ok, so what if the programmer tries to use a string index? We cannot coerce non-numeric strings! Well, why don't we just unify arrays and dictionaries like we've just unified all our number types? Now you don't have to worry about invalid indexes since all indexes are converted to strings! And if the programmers want to use 0-based, 1-based, 2-based or 42-based indexes they can do whatever they want! They can even have non-contiguous arrays or mix arrays with associative dictionaries. Total freedom.
This thought process is completely imaginary. I suspect not a lot of thought was given to this type of flexibility at all back in the day. It was just a widely accepted assumption that scripting languages should be flexible, free and relaxed. And merging arrays and dictionaries (or "all arrays are associative arrays" as it was often called back then) was just one more idea that naturally fit into the zeitgeist, like type coercion and "assignment to an unknown variable sets a global variable by default".
So tldr: it was a fad.
Not to my understanding. What is called "array" in the language is indeed associative, but lists (heterogeneous vectors, like Python's) have always been here and distinct.
But otherwise, insightful post. Awk is also the first thing that came to mind, with its funky arrays and "undefined variables are initialized on first use to the empty string or 0 depending on the context".
I will never use lua again and avoid any project that uses it.
I've also never seen a clean (non-trivial) Lua codebase before. They're all spaghetti messes and difficult to follow. In fact, some of the worst, least maintainable, most difficult-to-read code I've seen has been in Lua.
For configuration (what it was originally designed for) or for very small, non-critical stuff, then it can be a good choice.
With that said, I know some people are very productive with it, and have made impressive things in Lua. It's just not for me.
This entire comment accurately depicts what it’s like to actually use Lua.
The worst, most horrific codebase I’ve ever worked on was in Lua for some godawful reason.
I remember when I thought like this when I was first learning C++ and thought it was awesome because of all the features built into the language. Since then, I've learned that things that are "barely a programming language at all" like Lua—where you can easily learn the whole language and then write, if necessary, reflective object-oriented effortlessly parametrically polymorphic code—are actually the best languages of all. (Such extreme liberalism does require a compensating amount of discipline, as you say.)
Lua has its flaws (implicit nils, the nil/# interaction, default global, 1-based indexing, etc.) but minimalism is one of its major virtues.
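For anyone who hasn't seen it, the usual metatable-based class pattern is about ten lines, which is what "learn the whole language and build the OO yourself" looks like in practice (a minimal sketch, not any particular library's API):

    local Point = {}
    Point.__index = Point

    function Point.new(x, y)
      return setmetatable({ x = x, y = y }, Point)
    end

    function Point:length()                 -- ':' sugar passes self implicitly
      return math.sqrt(self.x^2 + self.y^2)
    end

    print(Point.new(3, 4):length())         --> 5.0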
> you can easily learn the whole language and then write, if necessary, reflective object-oriented effortlessly parametrically polymorphic code
This is all you can do in these languages. Anything above it will share the fragility and ad-hocness of this meaningless language para-olympics.
Toying with it with all the free time on one’s hands is okay.
Avoiding useful techniques is an option, but doing it when the world of saner alternatives exists feels like self-sabotaging.
The language itself may be a matter of taste. I prefer languages which, "in the face of ambiguity, refuse the temptation to guess," but certainly I've spent enough time using Perl, bash, JS, and BASIC to be able to deal with Lua's relatively mild level of DWIMminess.
I was surprised to find out I'd been wearing javascript on my arm all this time
I’ve also had luck using it in a wasm runtime thanks to quickjs-emscripten. Good if you need sandboxing sans full process isolation.
Lua rocks is a wart on Lua’s piss poor existence, not an advantage. Lol.
Not that this was a hard topic in a language where lifetimes are the main concern and business as usual. In C/etc you can’t hold anything without a ref or some sort of a lifetime guarantee. What makes it special wrt Lua/Python?
Delayed GC seems like it might add headaches for this kind of thing rather than removing them, because your bugs may take a while to show up in testing.
I guess in C one could macro-hack a similar arena to borrow objects from the Python runtime.
In C++ it’s a matter of befriending a smart pointer with ref/unref calls. It was probably done hundreds of times in all sorts of python embedding wrappers.
It's funny how through the years I've started to appreciate statically typed languages and type hints (Python, TypeScript) as well. Up to my current stance: it's a disservice for a community to write hintless python or js code.
P.S. Recently I've rewritten 1.5k cloc of JS into TypeScript. As usual there were dozens of cases of missed nulls, null attribute accesses or semantically questionable nulls.
Also the duck typing crowd were rebelling against polymorphism and over-used meta-programming OOP slop, not against types per se.
My argument against types in python was literally: it's hard to model my domain with static types. And we have tests!
I felt types would severely decrease my productivity as 10x coder. Embarrassing, right?
Strongly disagree. If you are using Python or Javascript or any other duck-typing language, you are supposed to know what you are doing, and types are completely unnecessary. I'm amused at how the community in general praises and adopts TypeScript in so many projects. You can just write reliable JS code as it is. I have never written any hinted Python code and everything just works as expected. If you need some guarantees then I would probably switch to some unit testing, which at least for me brings more value than using types.
You can, but I can’t. I often just miss fixing some identifier in the little clean-up phase after prototyping yet another chunk.
> unit testing which at least for me brings more value than using types
Not everyone writes for testing. E.g. I have large-enough ai related scripts in typescript that require types to feel safe to change. But I won’t create tests for these only to find typos and editing leftovers. Would be such a waste of time.
Typescript types are only required in “entry points” and live their own life afterwards thanks to inference. To me it was so worth it that I stopped creating js sandboxes for little experiments and made a ts project scaffolding script to spin up a new app in a few keystrokes.
And when you want to learn about “third party” code, types are irreplaceable. Code tells absolutely nothing compared to types. A single minute in .d.ts is worth an hour in .js.
- Tests are great, but it's hard to maintain 100% coverage that checks all of what types do. It's very possible to have tests that pass that shouldn't. Types can not only find stealthily broken code, but also broken tests.
- Type checking gives even more instantaneous feedback than almost any test runner.
- Most projects have multiple developers. Types don't communicate everything, but they are a reliable form of communication that is guaranteed to be up-to-date even when documentation isn't, which is great combined with good function names and good parameter names.
- Most programs need to deal with requirements that shift, either because they evolve or our understanding evolves. Refactoring code without types is hard: if you change the return value of something, how can you be sure that all of the usages are up-to-date? What about if your refactor has been rebased a number of times before merging? I've run into a real-world production incident caused by this, when a return type changed to a tuple and it only exploded in an important Celery job that didn't have a test around the specific boundary that broke.
"If you know what you're doing" is not enough. Programs change, and they are changed by multiple people concurrently. This is the same reason why Rust continues to grow; people can say what they want, but moving errors to the left is the only way forward, because even if you're literally perfect you still have to know 100% of what's going on to never merge anything that causes a problem. (But really, people are not perfect, and never will be, so better tools will win eventually.)
I get perverse thrills out of drawing the attention of the "pfft, types are for n00bs" crowd to the multiple authentication-related CVEs in GitLab due to the "well, the parameter types are whatever they are, don't you worry about it" attitude.
That said, I'm trying very hard to wean myself off even reading these threads, let alone contributing to them, because much like <https://news.ycombinator.com/item?id=42492508> I'm never going to convince them and they for damn sure are never going to convince me.
You are human, you simply can't.
I've literally found bugs in MY existing code by a mere rewriting it to TypeScript. It's the same story every single time. Even for projects with 100% test coverage.
It has a massive advantage in the existing ecosystem. People don’t always love it, but millions of them already know it. You don’t need to re-learn array indexing and nil termination quirks and the dozens of other things where Lua is subtly different in ways that you’ll find out at runtime.
There are JS engines in all sizes, from tiny interpreters to world-leading JIT performance monsters. And most operating systems nowadays ship with a high-quality engine and embedding framework that you can access without installing anything, like JavaScriptCore on the Apple platforms.
Do your users a favor and just use the boring standard language instead of making them learn a completely new one that doesn’t offer meaningful practical benefits for them. It’s probably not as much fun for you, but you’ll save time for both the users and yourself in the end.
Anyway, every language has some fans, and I don't mean to dismiss them, but ubiquity is not the top priority, all the time.
Js needs no forward declarations, which in Lua turn into ugly “local foo; … … … foo = function()…end”. All functions are “hoisted” by default.
Js also uses “export”s or “module.exports[…]” instead of returning a symbol table, which allows cross-referencing.
Js has a function scope (var) together with local scoping.
Js doesn’t allow accidental shadowing in the same scope. Lua: local x; local x; silently creates two variables, good luck debugging that closure.
Js type coercion is only insane in the parts where operation itself makes zero sense. Please don’t tell me that ![0] or [] + {} makes sense to you or that you use them in other languages. Or that s+n and n+s seems safe everywhere except in js.
Js has a ternary operator and ?? ?. ??= operator group. Lua only has that “and-or” crutch and “not X” which returns true only for nil and false.
Js “…arr” is a first-class citizen, while in Lua “…” decays to a single value (its first) if anything follows it.
Js (normally) ordered properties system is much more convenient in dev and debug time than an unordered mess of meta-hashtables.
Js doesn’t require you to
{
myPropertyHere = gotSomeObject.myPropertyHere,
…
IllRepeatThisAsManyTimesAsNeeded = gotSomeObject.causeWhyNot,
}
gotSomeObject.someNumericField = gotSomeObject.someNumericField + 1
which is extremely abhorrent DX in 2024.
It is beyond a footgun, more like a footgun concealed as a shoehorn.
Either way I find embedding a Lua VM into a project so much easier than embedding a Javascript one.
Noobs are going to find the footguns in any language. Every language can be misused. I've been using Javascript over 20 years and have never thought it was any worse than any other language. Yes, it's dynamic, and yes, you can do stupid things with that dynamism, but for someone who actually knows how to use it, it's quite easy to work with.
I recently started using Lua and I'm not impressed at all. It's just another language, it's really up to the programmer what can be done with it, just like Javascript.
If this is true, why not write everything in C++?
Javascript includes so many footguns that people know are bad (see the billions of examples of different types being added and getting random answers). Every language has some difficult parts; pretending that means any amount is fine is crazy. Javascript has earned its reputation.
Almost every "footgun" people cite about Javascript is something that would never be written as working code, by any competent programmer - they are "WTF" in that the person writing the "gotcha" has to go way outside of normal programming behavior to get the "WTF" result. Most of these "WTF" examples should be taken with a grain of salt.
If you're expecting to "WTF" about Javascript's type inference/coercion, that works according to some logical rules.
https://media.geeksforgeeks.org/wp-content/uploads/advanced-...
This matrix should be pretty easy to understand. Yes some things might seem weird like [[]] == false, and [1] == true, but nobody should be writing code like that, it's just illogical to do anything this way and I don't consider it to be a "gotcha". "Just because you can, doesn't mean you should" can be said about footguns in every language.
Yes, there are "WTF"s, but just like if I were to write a C++ "WTF", it wouldn't be something that most people would be tripped up by unless they were specifically looking for a way to say "WTF" about the language.
These "javascript sucks" internet arguments have persisted for decades, and it's always complaints about the language that are citing unreasonable examples to prove a point that a programming language has some rough edges. Every programming language has some rough edges.
Javascript maybe has more footguns than some other languages because it is so dynamic, and the dynamism has its benefits too that other languages don't have. I do like it and I use it effectively, YMMV. People just seem to be specifically mad at Javascript because it's so popular and was made so by being the defacto standard for web browsers. It's part jealousy and part frustration that their favorite programming language isn't the one that web browsers include by default.
For better or worse, JavaScript is the Lingua Franca of modern programming - you have a high-performance runtime environment pre-installed on basically anything, and readily installed on greybeard boxen if you are that way inclined. It is my "go to" for all hobby projects, frontend and backend. So versatile and a genuine joy to code in.
* Duktape python tooling is obsolete, migrate to JS-based tooling! *
$ python2 duktape-2.6.0/tools/configure.py --output-directory src-duktape \
-UDUK_USE_ES6_PROXY
I'm glad to hear my impression that there was no other build system was wrong!
Javascript took an extremely high-complexity, high-cost, and nearly-all-effort-in-just-two-impls-for-over-a-decade route to get to the current (stunning) performance though. How do small-scale embedded interpreters stack up? That's what many things that aren't Electron apps will be using.
I ask this honestly, to be clear - I'm not sure how they stack up. I know Lua has historically targeted and dominated this area so I'd expect it to win, but I have not seen any data and I largely agree - JS is a massively better development environment now, and I would love to choose it. Everything I can find pits it against v8, and that's kinda like comparing the performance of a bike against a train - they are so different in many situations that it's meaningless. Otherwise I'd probably choose WASM today, or maybe docker containers where performance and platform needs isn't an issue.
Moddable's XS runtime is a wonder, with 99%+ ES2023 conformance even on devices running at 80 MHz with 45 KB of free memory. https://github.com/Moddable-OpenSource/moddable
QuickJS has somewhat higher requirements, but also a great candidate for embeddable mobile and desktop applications, and is well-regarded. https://bellard.org/quickjs/
I'm no developer, I do Linux and some bash and Python, and Lua looks "familiar" to me, so I can quickly pull up something simple.
With Javascript everything looks so alien to me, and it's a pity because there is lots of things that run Javascript (functions in the cloud and the like).
JS and Lua are very similar feature-wise and I imagine Lua hasn't gotten the massive JIT runtime optimizations that JS has, so JS seems a far better choice these days.
The far better arguments are that it's more common and there are more libraries.
So if you only need a small language to do a couple thousand lines of code in, then it still might be a better choice. But for larger projects JS still seems better due to the ecosystem and knowledge available.
But if you compare some games like Baldur's Gate 2, where most of the event logic and quest state is in the scripting language, then maybe it is worth spending the time to add JS. But if it is like Civilization 6, where only small stuff like UI and I/O mapping is in the scripting language, then it is a different matter.
Lua as an embedded language already had over a decade head start.
Microsoft also had server-side JS, and still do. Active Server Pages (ASP) were scriptable with javascript since 1996. Microsoft's JScript.net has been around since 2002 (even if not well supported or maintained by M$, it still works). I've written web back-ends, Windows desktop applications, and even .dlls with JScript.net as far back as 2003.
Javascript has been Adobe's scripting engine for Photoshop since 2005. Sony chose Javascript for Vegas Video's scripting language in 2003.
Javascript has so many uses outside of a web browser, and has for a very long time.
But I guess my anecdotal experience doesn't matter, because I wasn't a "mainstream" developer? From my point of view, everyone else was missing out on using one language for front-end and back-end.
Context switching has a real cost in terms of developer burnout, and it's something I've happily avoided for 25 years.
Also there are whole books written on the quirks you need to master with Javascript. Lua has its rough edges but it's a much simpler language, so there are far fewer of them.
And a lot of the issues with JavaScript just wouldn't matter for an "embedded scripting engine" use case imo.
Game dev, possibly (even that seems highly debatable). But the article was making the case that Lua should be more widely used outside of that niche.
I also think 9 out of 10 languages are overrated and we have far too many.
That said, I think there are other warts that can be confusing and make the language difficult. The Nil-Terminated Arrays section points out one oddity about `nil` - but there are many others. There are a number of confusing corner cases (or were - I haven't used it in a while). As an example, if you have a list of values and you pass that list into a new table, the resulting table may be different than if you add each value in that list in order.
I also think that the flexibility of embedding means that you are often learning the Lua <=> C interface as much as you are learning Lua. So jumping from deploy to deploy is more confusing than most languages.
Nevertheless I fully agree the language is underrated!
One time I had a memory leak in a game, something like exactly 4 bytes per frame.
It turned out I was not popping the Lua-C parameter passing stack properly after each frame.
I found this out after some unhelpful person suggested that "all programs just leak memory" lol
Any integration code that doesn’t use an ad-hoc helper library looks like assembly listing (not even in disguise, cause it basically is bytecode but in C).
See my other reply: https://news.ycombinator.com/item?id=42519033
> Any integration code that doesn’t use an ad-hoc helper library
There are mature bindings for most languages. You only need to deal with the C API directly when you
1. Create a new language binding
2. Write a Lua extension
3. Call Lua from C
Typically, Lua is embedded as a scripting or configuration language into another project, and in this case you wouldn't even be aware of the C API.
What I find fantastic about the stack-based API is that it exposes very few interpreter internals. As a result the authors may drastically change the implementation without breaking the public API. Most importantly, they hide the internal object representation and as a user you only ever deal with plain C types, like integers, floats, C-strings and user data pointers. As a consequence, you don't have to think about memory management because all types are passed by value! In fact, someone could theoretically write a Lua implementation with reference counting and the API wouldn't have to change. It really is a very smart and elegant design.
It’s ok, but should be much better.
Once upon a time Supreme Commander used Lua. Perf on consoles for SupCom2 was catastrophically bad. We switched to Kore which was effectively Lua but way faster (10x?).
Kore was bought by Havok and rebranded into Havok Script. Havok got bought by Microsoft. I’m honestly not even sure if you can still buy it today. It’s not listed on their website.
These days I think you’d use Luau? I’m not sure. https://github.com/luau-lang/luau
The most fun was when I made a game engine ("engine") in C++ that used Lua for all the game logic. The fast debug cycles are very useful there. Maybe I should still do that for Rust gamedev, since Rust notoriously compiles slowly.
The second-most fun was an IRC bot that could reload its Lua logic without disconnecting. Nowadays I'd write a bouncer and just let the main process restart. (Or use wasm, a wasm IRC bot would be good...)
The third-most fun was using Lua as a Turing-complete configuration language. If I needed an array of something, I just made Lua make an array, instead of generating Lua code.
One of the least fun was the Blender plugin that exported 3D models as generated Lua. If I did that one again today, I'd just use MsgPack or some ad-hoc binary format.
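The config-language case is about as simple as embedding gets: the config file is just a Lua chunk that returns a table, and the host wraps a dofile (or a sandboxed load) around it. A rough sketch, file name and fields made up:

    -- config.lua: plain data plus a loop, since it's a real language
    local ports = {}
    for i = 0, 3 do ports[#ports + 1] = 8000 + i end

    return {
      name    = "demo",
      ports   = ports,
      verbose = os.getenv("DEBUG") ~= nil,
    }

On the host side that's just local cfg = dofile("config.lua"), or load() with a restricted environment if you want to sandbox it.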
The only benefit of Lua ime is the ease at which it embeds in C++. Beyond that, it's a nightmare.
Its greatest improvement over vanilla Lua is better sandboxing. Lua is just pain in the ass to embed safely. So many footguns.
On the other hand, if the article is saying that it deserves to be more highly rated in the non-embedded space, I think I would need more than a handful of syntactical niceties to explain why it deserves to be in the same conversation as Python, Ruby, etc...
[1] https://github.com/rochus-keller/Luon/blob/master/Readme.md [2] https://news.ycombinator.com/item?id=42413343
This article is fairly light on detail, only mentioning a couple table footguns, but there are MANY more. (I also disagree with it being "accessible even for beginners", but I'll stick to objective things.) For starters, the size operator `#` doesn't work on map-like tables, and there's no easy way to copy part or all of a table, or to serialise one for printing.
Lua doesn't have `switch` or even `break`/`continue`. Though it added a `goto`—years after we collectively realised that's an antifeature. You can use `and`+`or` as a ternary, but you need to remember that it works differently with bools and that `nil` is falsey ofc. And `0` is truthy. Using a variable which hasn't been declared yet or is otherwise out of scope gives `nil` rather than an error. In fact most logic errors are SILENT (yay dynamic typing), and some syntax errors are raised far from the actual cause. `<const>` isn't.
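The and/or "ternary" also has the classic pitfall that it silently misbehaves when the middle value is false or nil (standard Lua semantics):

    local verbose = true
    local level = verbose and false or 1    -- level is 1, not false
    if 0 then print("and 0 is truthy") end  -- prints; 0 and "" are both truthy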
Before Lua 5.3, all numbers were floats. The patterns used for `string.match` look superficially like RegEx but they are not, despite them predating Lua. The stdlib is woefully lacking, with the official documentation seeming to taunt you with examples of how several common functions could be implemented, but it's left to you to copy them into your projects.
So yeah, Lua might be small and quaint, but that's only because so much is not included "in the box", and what is included is no good.
I think it's clear in retrospect that it makes more sense to keep lists and maps separate even in a high-level dynamic language that doesn't care much about perf, and any purported gains in simplicity from having a single data structure are illusory because of footguns. But perhaps it wasn't so clear before we had enough experience with this kind of design.
Then again, Python had a clean separation between lists and dicts from the get go, and it predates Lua as well as PHP and JS...
Lua is mostly used inside games, and the JIT compilers for it are said to be pretty good. (I haven't tried this yet, but will have to soon.)
But seriously, they do have some interesting quirks which make sense if you think of it (and maybe Lua as a semi-contemporary got its ideas from there), specifically, hashes are pairs of keys (even/odd) in a flat array... and of course objects are just key/values, but with some values being names and functions...
When I last used Lua professionally (10 years ago) I did discover some foot-guns related to the handling of NULL / nil in the LuaSQL module. The Lua array length function was based on nil-termination, so a NULL value in a column could cause issues. Maybe this has been fixed by now?
Not necessarily, plenty of editors have scripting options that aren't going to win any performance price.
Also VI architecture goes back to the days of single digit MHz.
In other languages these would require much more effort.
[1] https://github.com/spring/spring or https://github.com/beyond-all-reason/spring (yay for forks!)
[2] Preserving the order of iteration on items in a hash table after deserialization and further manipulations.
I would like to see more minimalist scripting languages with modern sensibilities replace it.
It comes with batteries, but they only fit into one brand well. Compare this to e.g. golang or Python.
Not exactly mainstream, but so simple and elegant.
I wrote this post https://andregarzia.com/2021/01/lua-a-misunderstood-language... some time ago and it kinda touches similar points as the OP. Read on if any of yous want to see yet another person talking about Lua.
coming from mostly python, i miss a robust stdlib.
Both Lua and Fennel are tiny languages. You can skim all the documentation in a short evening. Fennel does not even have its own standard library, so keep the Lua documentation close at hand.
I appreciate the simplicity of both Lua and Fennel. There are some ugly parts, but all languages have those.
Janet (later project by Fennel's creator) is a nicer language that is also portable and embeddable, and has a nice standard library, but Fennel runs anywhere Lua runs. Janet is great just on its own for scripting and not sure I would want to use Fennel for that anywhere Janet is available. But as a way to script games in e.g. Löve2D it seems like an excellent choice.
As a learning experiment, I was able to use the fennel website to convert my Hammerspoon config from lua to fennel, but I still struggle to add new functionality. Certainly have not delved into macros, but this is where much of the power lies AIUI.
No item.length property.
You might decide to count with foreach(), but that will terminate when it encounters the first value in the array that is NULL.
It's a bad idea to think of Lua tables as arrays. They're maps/dictionaries with some extra semantics to allow them to be used similarly to arrays.
True only if the programmer doesn’t take the effort to understand the difference between ipairs and pairs.
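For readers who haven't used Lua: ipairs walks only the contiguous integer keys and stops at the first nil, while pairs visits every key in unspecified order. E.g.:

    local t = { "a", "b", name = "mixed" }
    for i, v in ipairs(t) do print(i, v) end   -- 1 a, 2 b (array part only)
    for k, v in pairs(t)  do print(k, v) end   -- all three entries, order unspecified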
We have a huge Lua codebase for game client and game server shared code, all written in TypeScript. Our developers only see Lua if something does not work, and it's not often.
And I would argue that a lot of the staying power of World of Warcraft came from the Lua based addons democratizing the UI. Blizzard co-opted a lot of features from best of breed addons back into the base game. (That and negative pressure in the economy to keep from having every new character wielding the third best sword in the game due to deflation, but that’s a story for a different thread)
It's a language I've created that has C-like syntax; the implementation is a single .c file. Its unique feature is the ability to arbitrarily extend the syntax (in fact, classes and the type system are implemented as a library). This can be used for the specific needs of your project.
It also has the ability to set time limits for execution and can reload scripts in place for live code updates. It also has a JIT by default.
More information here: https://www.fixscript.org/blog/introduction
The zero thing is a historical quirk due to pointer arithmetic.
Many people love Lua, so I suspect there is a good reason for this.
It's a trivial difference that does not matter.
- }, like C: Lua uses this to terminate tables. You could use the same token for both purposes (Perl5 does) but the reduced redundancy comes at some cost to both readability and syntax error reporting, which are especially important in languages aimed at driveby programmers, like Lua.
- ): Lisps generally just use ) to terminate every construct, reducing redundancy further.
- different end tokens for each construct, like Ada and the Bourne shell (endif/end if/fi): improves readability and error reporting further, but requires more reserved words and is wordier.
- end, like Lua: intermediate among the above choices.
- indentation, like Python: seems to work well enough in Python, and especially helps beginners who are therefore not permitted to indent their code incorrectly, and adds a maximum amount of redundancy for opening and closing constructs (every line is explicitly tagged with its syntactic depth) without reducing readability. However, this aspect of Python was novel when Python was introduced, remains controversial, and has not been widely adopted in new languages.
A funny thing about Lua's syntax is that it's not newline-sensitive, but semicolons are optional. This is valid Lua:
function x(a, b)f(a)f(b)g = a b = a end
Wherever two things are juxtaposed that can't be part of a single statement, it tries to parse them as two statements. This syntactic looseness means that some syntax errors that would be easily detected in semicolon languages aren't easily detected in Lua. This increases the penalty for removing additional syntactic redundancy.
I thought I read somewhere that Lua was meant for use by non-programmers, so the "end" would be easier to type and read than curly brackets.
Presumably they thought the more verbose syntax would be more readable.
Modula changed it so that all structured constructs implicitly introduced a compound statement, and terminated one where appropriate (e.g. on `else`), but this still required some end marker for the last one.
If the question is why `begin` and `end` and not `{` and `}` - for one thing, Algol was designed to be the language you'd write reference implementations of algorithms in for publication, so readability was more important than terseness. For another, it predates ASCII, and not all charsets in use at the time even had `{` and `}` as symbols.
I use this for a project of mine where I embed a Lua interpreter where I made several additional Lua modules available.
OBS was one stop along my journey to get there, as there's some Lua and Python scripts that achieve something similar, but not quite exactly what I wanted with multi-screens, so I began trying to modify the script to do what I needed it to do.
In the end I'm going with something called "SparkoCam" which achieves this easily with no hassle, but it costs $40. Since $40 is less than what an hour of my time is worth, it was kind of a no-brainer to abandon the hassles of learning Lua (a many hour detour) and fiddling with OBS Studio to get what I wanted. I have no real use for Lua outside of this, so it would be time wasted on this one project.
This alone should underrate the language even more so that no poor user is exposed to this kind of nonsense
There are some places where Lua has surprisingly poor performance. However, in almost everything else, it's one of the fastest around. Luajit is still one of the fastest JITs in existence.
In the real world, rather than under the constraints of a benchmark, Lua tends to be on par from what I've seen. When you're not cold-caching everything, its memory use is lower. But its standard library is ISO C, and that's about it. No batteries. And the others beat it soundly there, whilst being close to the same performance. Which means they will just about always win out.
[0] https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
[1] https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
Now made a little easier for your comparison
https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
Btw. it seems notable that Lua 5.4.6 is about 40% slower than Lua 5.4.1, see http://software.rochus-keller.ch/are-we-fast-yet_Lua_results....
Here are the results: http://software.rochus-keller.ch/are-we-fast-yet_lua_results...
Overall, the JS/V8 version was twice as fast as the Lua/LuaJIT version. LuaJIT was only faster in the Mandelbrot benchmark. The recent PUC Lua version 5.4.1 is even ten times slower than the V8 version at the time. More recent V8 will likely be even faster. The results also demonstrate that one can get a pretty false impression when just looking at single microbenchmarks.
I'm maybe not the best person to write this because I'm not really a huge Lua fan. Though on projects with others I'll use whatever language they're using, left to my own devices, I probably write about 3× as much JS as Lua, about 32× as much Python as Lua, and about 20× as much C as Lua.
But Lua has some really nice attributes.
It's a pretty reasonable very-high-level language comparable to Python, JS, Groovy, Perl, or Clojure. You can get a lot done in very little code. You don't have to crank out piles of boilerplate the way you do in Java or sometimes C. It comes with concise and readable syntax, flexible data structures, garbage collection, strong dynamic typing, list structure, hashmaps, dynamic method dispatch, inheritance, operator overloading, cooperative multithreading, eval, exception handling, namespaces, closures with lexical scoping, etc. Its documentation is first-class. Most of this is pretty much standard for the dynamic-language category nowadays; there's not much special in here to pick one language from the category over another.
Lua's biggest advantage is that LuaJIT is motherfucking alien technology from the future. I wrote a straightforward escape-time fractal renderer in https://gitlab.com/kragen/bubbleos/-/blob/master/yeso/mand.l... with a dynamic function invocation in the inner loop (to switch between fractals) and LuaJIT's trace compilation just compiles that right out. There's also a real-time animated raymarcher in that directory in LuaJIT, though the framerate and resolution are pretty bad. You can literally just write high-level code at like a Python or JavaScript level and, as often as not, it just goes faster than C. (Sometimes it doesn't. JIT compilers are unpredictable and sometimes disappointing.)
Now, probably somebody is going to reply to this and tell me that everybody runs JS in JIT compilers too, so this isn't a real difference. Unfortunately that's horseshit. I'm sorry, but V8 and SpiderMonkey are just not in the same league, though not because their implementors are less brilliant. They're just attempting a much harder job, because JS is a much hairier language. Consequently their performance is an order of magnitude worse.
(Even regular non-JIT Lua is a lot faster than most other very-high-level languages, almost to the level of JS JIT compilers.)
LuaJIT's C FFI is also just ... there's no expression in English sufficiently emphatic to express how great it is. The expression I'd normally use translates literally as "it's a very child of ten thousand whores," which hopefully conveys some of the emphasis if not the content. Here's the FFI binding for Yeso, the graphics library used for the above fractal demo: https://gitlab.com/kragen/bubbleos/-/blob/master/yeso/yeso.l.... Yeah, just by pasting the relevant lines from a C header file into your Lua program, you can call C code from a shared library as if it were Lua code. It's almost as easy as calling C code from C++!
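For a flavour of what that looks like, here's a toy sketch against libc rather than the Yeso binding itself (the "mylib" name at the end is purely hypothetical):

    local ffi = require("ffi")

    -- Declarations pasted straight from a C header; libc is already in
    -- the process, so the default ffi.C namespace resolves printf.
    ffi.cdef[[
    int printf(const char *fmt, ...);
    ]]
    ffi.C.printf("Hello from %s!\n", "LuaJIT")

    -- For your own shared library you'd do something like:
    --   local mylib = ffi.load("mylib")  -- hypothetical libmylib.so
    --   mylib.some_function(42)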
After getting screwed over (along with everybody else) by the Python community's gratuitous and ongoing breakage of backward compatibility, coupled with public shaming to persuade more people to break backward compatibility, the Lua community's approach to backward compatibility is looking pretty appealing. The vibe is that old versions of the language live forever, and you should keep using them instead of upgrading, but new versions usually aren't compatible. However, it's much easier to make a Lua library compatible with every existing version of Lua 4 and Lua 5 than to make a Python library compatible with any version of both Python 2 and Python 3.
Being small, fast, and implemented purely in portable standard C makes it practical to run Lua even on larger microcontrollers like the ESP8266, even when they use weird CPU architectures. Even interactively. https://github.com/nodemcu/nodemcu-firmware
LuaJIT is much less portable than PUC Lua, and also more than twice the size, about 580kB on my machine. This is still orders of magnitude less than something like Python or the JRE.
Lua is a little more annoying than Python or node.js to develop interactively with, but it's overall pretty comparable. For code I'm writing myself (as opposed to getting from a library) the Lua version usually ends up being just a little bit bigger than the Python or JS version. Lua is more bug-prone, though, due to a number of language design flaws.
Lua is a very newbie-friendly language, I think even more so than JS or current Python. It's very successful in systems aimed at beginning programmers, like Minetest (Luanti), World of Warcraft, and LÖVE2D. The barrier to entry is very low; you don't have to be learning the GUI of a new IDE at the same time that you're learning the programming language and, possibly, how to program. The standard library is small rather than overwhelming. And the download for standalone Lua development is tiny.
In which case, the regularity of lua's semantics comes to the rescue. It's much easier than in most languages to develop an alternative syntax which compiles down to it.
Notice I said "alternative syntax", and not "other language which transpiles to lua". Because in the case of moonscript (or yuescript) or fennel - you're still writing lua. Using lua semantics. With no runtime overhead.
Yes, you get this to varying degrees in other ecosystems (javascript et al). But it seems to me that the friction within lua is even lower. Once you buy into "closures all the way down" & "use tables for everything", then no matter what surface syntax you're designing, it's still, pretty much, lua.
- https://github.com/mingodad/ljs
- https://github.com/mingodad/ljs-5.4
- https://github.com/mingodad/ljs-5.1
- https://github.com/mingodad/ljsjit
- https://github.com/mingodad/raptorjit-ljs
- https://github.com/mingodad/CorsixTH-ljs
- https://github.com/mingodad/ZeroBraneStudioLJS
Underrated? Eh... it does stuff better than others but that doesn't mean much.
We really don't learn.
Still not a fan but it's nice that it has the facilities.
I think there's a lot more to language choices than syntax and weird behaviors like that; it's more about how you run/implement the language, the ecosystem, and things like that. Think about when people say JavaScript is a bad language because if you add together two empty arrays you get the number 0 or something. Or think about how you might be annoyed if you had to use a desktop app written in PHP (yes, people do this) because the author likes PHP.
In theory I was okay with the indexing. In practice it was an absolute PITA.
Plus there were numerous other papercuts; it felt like Lua was actively fighting me every step of the way. I went from liking Lua to absolutely despising it.
In the end I switched to C++11, which amazingly was much more productive. As in night-and-day difference.
https://github.com/nodemcu/nodemcu-firmware
> Based on Lua 5.1.4 or Lua 5.3 but without debug, io, os and (most of the) math modules
As expected, they didn't bother to support all versions and were probably done with it after the 5.4 announcement.
Lua's philosophy, as far as I can tell, is to be minimalist and allow for emergent features. You want arrays? We don't natively support arrays, but you can use tables as arrays. You want classes? You can build your own class system or prototype system using tables. You want to implement anything else? Well, we've got some metaprogramming for you. The problem with this approach is that you don't end up with one language, but with hundreds of different and confusing dialects, where each programmer does their own thing. This is how JavaScript used to be with regard to concurrency and classes, and it wasn't good.
The other issue that arises from this philosophy is that Lua lacks a lot of ergonomic features which cannot easily emerge out of its simple tables+closures+coroutines foundation. There are no type hints (like Python got), const (like JavaScript), default argument values, array/table destructuring, try/catch blocks, string interpolation, foreach loops, immutable values, etc.
These features sure add a lot of complexity to the language implementation, but they are extremely valuable in saving programmer time and cutting down bugs. Lua takes the Worse Is Better approach here and saves implementation complexity by pushing it down to the user. If this complexity doesn't exist in the Lua interpreter, it doesn't disappear: the programmer has to deal with it, which makes bugs more likely and debugging harder.
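To make that concrete, here's a small sketch (my own toy example, not from any particular codebase) of how some of those missing features typically get hand-rolled in plain Lua:

    -- Default argument values: hand-rolled with `or` (and silently
    -- misbehaving if the caller legitimately passes false or nil).
    local function greet(name, greeting)
      greeting = greeting or "Hello"
      -- String interpolation: hand-rolled with string.format.
      return string.format("%s, %s!", greeting, name)
    end
    print(greet("world"))  --> Hello, world!

    -- try/catch: hand-rolled with pcall, which returns ok plus the error.
    local ok, err = pcall(function() error("boom") end)
    if not ok then print("caught: " .. err) end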
Lua might be a worthwhile trade-off where the choice is between embedding Lua and embedding nothing, since other languages are harder to embed. Or in industries where it's deeply established, like gaming. But I cannot say it's a language I enjoy programming in, and I don't see any reason to use it in a non-embedded context. Apparently no one else does either, but that's not because Lua is underrated.
> Lua's philosophy, as far as I can get it, is to be minimalist and allows for emergent features
IMO this philosophy only shines in languages with macros, e.g. the Lisp family.
Wonder how they compare?
* (If the target hardware can afford V8/node runtime, obviously)
If it doesn't, I'd still use Duktape or QuickJS over a Lua engine
Mostly functional cross-platform animated UI toolkit based on cairo/pango/gdk. Never published, probably still have a repo. I was proud of it because it was better than GTK (if you ignore GC-related stutters).
A backend for a boring accounting-related project.
I also have developed an active-record like platform using Lua, but dropped the idea eventually. It was a repl-devenv-like thing with “active” ui, data, transparent networking, etc.
In the product-ish category, that's probably all. Not much of it was actual products, in the portfolio sense.
I simply don't see any reason not to use it .. for everything.
I've built games with it. I've built GUIs with it. I've built scientific analysis tools with it. I've used it as a total solution for previously-impossible embedded problems. I've used it for desktop applications and mobile apps. I've automated entire production lines with it. It has huge appeal as a general-purpose software development ecosystem - but those who know how to do this tend to keep it to themselves for some reason.
It can do everything.
There's just nothing I want to do with computers that I can't legitimately do with Lua - or its VM. Or LuaJIT. Or all of the above and a bit of luarocks built in. Throw some luastatic in while we're at it, bundle it all up as a .deb, and nobody needs to know a thing.
It's just so versatile. I hope someone builds an OS around it some day, making everything and anything the user wants to do accessible through a Lua module, or some smart libffi wrappers around a handful of .so's, somewhere ..
In my retirement, I'll take antirez' LOAD81, turn it into a proper Lua editor, add integration with luarocks and luastatic and maybe a shit-tonne of IPFS, and then there won't be any need, ever again, to deal with anything else.
I've tried various other Lua debuggers over the years, these are useful:
https://github.com/slembcke/debugger.lua
https://slembcke.github.io/DebuggerLua
You can also use the debug module:
https://www.tutorialspoint.com/lua/lua_debugging.htm
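For example, a minimal sketch using only the stock debug library (nothing project-specific assumed):

    -- Attach a stack trace to errors with xpcall + debug.traceback:
    local ok, err = xpcall(function() error("oops") end, debug.traceback)
    if not ok then print(err) end

    -- Or ask where execution currently is:
    local info = debug.getinfo(1, "Sl")
    print(info.short_src, info.currentline)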
Otherwise, I just use gdb if needed (JIT/FFI stuff) and printf's.
However, I don't really need to debug much these days - more just testing. It's also kind of important to have automated tests and run them frequently with every major change.
Another thing I use frequently in my Lua projects, which I think cuts down a lot of the common bugs that might require debugging, is the lua_enumerable.lua utility module:
No first-class support for OOP; you have to use tables instead.
No "continue" keyword to go to the next iteration of a loop. So you have to use some ugly elseif.
No try catch!
It's a dynamic language, so anything goes. A function will return nil or a number or whatever you wish. A function can also return any number of values, so obviously some moron will abuse it: looking at you, Blizzard intern (you must have been an intern, right?) who thought returning 20 values on the combat event log was a good idea. Ever heard of tables???
The LSP can't do miracles when there are almost no rules, so auto-completion is more miss than hit, which is quite shocking in 2024. The AI is helpful sometimes but creates subtle bugs at other times. A simple uncaught typo will create a hard-to-find bug (yesterday Copilot auto-completed myObject.Id instead of myObject.id; there went 20 minutes of my life trying to find why some code that had been running solidly for weeks was now failing silently).
So all in all, Lua is fine to write small imperative scripts of a few hundred loc. Anything bigger and more complex you should run away IMHO.
I never realized C# was such a well designed language until I tried Lua.
I think it just doesn't have much competition in the "embeddable languages" space unfortunately.
As another commenter said, JavaScript is probably the best choice in most cases today.
There's a list of options here, but most of them aren't really good options in practice:
https://github.com/dbohdan/embedded-scripting-languages
E.g. Python would be a terrible choice in most cases. Awk? Some of them, like Dhall, aren't even general-purpose programming languages.
If you narrow it down to embeddable-like-Lua there are hardly any.
You get metatables instead, which allow you to trivially write an OOP system that perfectly fits your needs. Yes, there is more than class-based OOP.
And I mean trivially: https://lua-users.org/wiki/SimpleLuaClasses
For some people not being locked into one form of OOP or any OOP at all is a huge advantage.
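A minimal sketch in the spirit of that SimpleLuaClasses page (the Point class here is just an illustration, not code from the page):

    -- A "class" built from a plain table and a metatable.
    local Point = {}
    Point.__index = Point

    function Point.new(x, y)
      return setmetatable({x = x, y = y}, Point)
    end

    function Point:length()
      return math.sqrt(self.x * self.x + self.y * self.y)
    end

    local p = Point.new(3, 4)
    print(p:length())  --> 5.0 (formatting varies by Lua version)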
> No "continue" keyword to go to the next iteration of a loop. So you have to use some ugly elseif.
Yeah, that sucks. Thankfully Luau fixes that and basically all annoyances I have with Lua. https://luau.org/syntax
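For stock Lua 5.2 and later, the usual workaround (rather than nesting the loop body in an if/elseif) is a goto label at the bottom of the loop; a minimal sketch:

    -- Emulate `continue` with a label at the end of the loop body.
    for i = 1, 10 do
      if i % 2 == 0 then goto continue end
      print(i)  -- odd numbers only
      ::continue::
    end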
> I never realized C# was such a well designed language until I tried Lua.
Lua and C# have fundamentally different design goals. C# sucks for embedding, you have to include a huge runtime. Godot still hasn't figured out how to support C# on web builds. Lua values minimalism and flexibility. That obviously means it lacks some quality of life features of bigger general purpose languages.
Language design is about trade offs.
This blog has just three posts, none with significant content. I was curious how this article even wound up on HN, until I realized that it was submitted by the blog's author: https://news.ycombinator.com/user?id=nflatrea
    class MyTable is export {
        has @.data;

        multi method new(@data) {
            $.new: :@data;
        }

        method render {
            table :border<1>,
                tbody do for @!data -> @row {
                    tr do for @row -> $cell {
                        td $cell } };
        }
    }
[in defence of raku, while it truly is marmite wrt Ruby / Python, there is a very loyal core of smart people that love the freedom and expressiveness it offers]