Lua is so underrated

(nflatrea.bearblog.dev)

286 points | by nflatrea | 22 hours ago

61 comments

  • antirez 9 hours ago
    Lua is great to embed, but for Redis I picked it as a scripting language with many regrets, because I don't like the language itself: it looks almost as if designed to have some kind of friction compared to what you expect to write in a language at that level of abstraction. Little design decisions here and there that accumulate into a language that is somewhat hostile (for me, at least). Yet it is so fast, easy to integrate, small in footprint and reliable that many times it is still the way to go. Btw, the hostility is present even at the C-API level, with the stack approach: it's not rocket science, but more cognitive load than needed. Even though I was exposed in the past to stack languages like FORTH, I still had to do some mental gymnastics while writing the bindings. Why?
    • ufo 4 hours ago
      Have you seen the Passing a Language Through the Eye of a Needle article, from the Lua authors? It talks about how some of the unusual design decisions, such as pcall error handling and the stack, are tradeoffs made in favor of "embeddability".

      https://queue.acm.org/detail.cfm?id=1983083

      The main purpose of the stack is to facilitate garbage collection. The GC can find out what Lua objects are currently being manipulated by the C client. The price is that the Lua API can never expose a "Lua object pointer" to C. Many other scripting languages expose such pointers, but then must also expose ways to manage their memory. For example, in Python the user of the API must explicitly increment and decrement reference counts.
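      A toy model of that stack discipline (Python, purely illustrative; the real API is C functions such as lua_pushnumber and lua_gettop, and the real stack holds Lua values the GC can trace):

```python
# Toy model of a Lua-style stack API: the host never holds object
# pointers, only stack indices (positive from the bottom, negative
# from the top), so the runtime always knows what the host references.
class ToyStack:
    def __init__(self):
        self._s = []

    def push(self, value):            # cf. lua_pushnumber / lua_pushstring
        self._s.append(value)

    def get(self, index):             # cf. lua_tonumber(L, index)
        # Lua indices are 1-based from the bottom, negative from the top.
        return self._s[index - 1] if index > 0 else self._s[index]

    def top(self):                    # cf. lua_gettop
        return len(self._s)

    def pop(self, n=1):               # cf. lua_pop
        del self._s[-n:]

stack = ToyStack()
stack.push(10)
stack.push("hello")
assert stack.get(1) == 10        # bottom of the stack
assert stack.get(-1) == "hello"  # top of the stack
stack.pop()
assert stack.top() == 1
```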

    • astrobe_ 5 hours ago
      A historical thing:

      > Traditionally, most virtual machines intended for actual execution are stack based, a trend that started with Pascal’s P-machine and continues today with Java’s JVM and Microsoft’s .Net environment. Currently, however, there has been a growing interest in register-based virtual machines (for instance, the planned new virtual machine for Perl 6 (Parrot) will be register based). As far as we know, the virtual machine of Lua 5.0 is the first register-based virtual machine to have a wide use.

      The API directly reflected the (previous) internals of the VM, I guess [1].

      [1] (pdf) https://www.lua.org/doc/jucs05.pdf

      • ufo 4 hours ago
        The new register-based virtual machine still uses the same stack under the hood. The only difference is how instructions are encoded in the bytecode: instead of only pushing and popping at the top, they can read and write any position in the stack. The Lua stack API already does some of this, because several operations take stack indices as arguments.
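        The encoding difference can be sketched with a toy example (Python; a hypothetical illustration, not actual Lua bytecode): a stack machine manipulates an implicit top of stack, while a register machine names its operand slots directly.

```python
# Toy interpreters for the same expression, a + b.
def run_stack(code, vars):
    # Stack machine: instructions implicitly use the top of the stack.
    stack = []
    for op, *args in code:
        if op == "PUSH":
            stack.append(vars[args[0]])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack[-1]

def run_register(code, vars):
    # Register machine: each instruction names source and destination slots.
    regs = {}
    for op, *args in code:
        if op == "LOAD":
            dst, name = args
            regs[dst] = vars[name]
        elif op == "ADD":
            dst, a, b = args
            regs[dst] = regs[a] + regs[b]
    return regs[0]

env = {"a": 2, "b": 3}
assert run_stack([("PUSH", "a"), ("PUSH", "b"), ("ADD",)], env) == 5
# One ADD that reads slots 1 and 2 directly -- no pushes and pops.
assert run_register([("LOAD", 1, "a"), ("LOAD", 2, "b"), ("ADD", 0, 1, 2)], env) == 5
```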
    • ck45 7 hours ago
      Did you consider any other languages, like TCL? If yes, do you remember the pros and cons of that set of languages?
      • progre 5 hours ago
        Was the first version of Redis not in fact written in TCL?
    • matheusmoreira 7 hours ago
      > the hostility is present even at C-API level, with the stack approach

      > Why?

      I suspect it's because implementing control mechanisms is much easier when you reify the language's stack. Especially advanced ones such as generators and continuations which require copying the stack into an object and restoring it later. Making that work with the native stack is really hard and many languages don't even try.

      It also makes garbage collection precise. Lua values in C variables would be placed in registers or the native stack. In order to trace those values, Lua would require a conservative garbage collector that spills the registers and scans the entire native stack. By managing their own stack, they can avoid doing all that.

    • peutetre 5 hours ago
      WebAssembly is a nice way to go for such things in the future. You provide the interface, their wasm code runs in a sandbox, and your users get to write their plugins in whatever language they like.
  • transpute 22 hours ago
    If you like Lua, see Terra, https://terralang.org/

    > Terra is a low-level system programming language that is embedded in and meta-programmed by the Lua programming language.. Like C/C++, Terra is a statically-typed, compiled language with manual memory management. But unlike C/C++, it is designed from the beginning to be meta-programmed from Lua.. In Terra, we just gave in to the trend of making the meta-language of C/C++ more powerful and replaced it with a real programming language, Lua.. Terra programs use the same LLVM backend that Apple uses for its C compilers. This means that Terra code performs similarly to equivalent C code.

    From Pat Hanrahan's research group at Stanford, https://amturing.acm.org/award_winners/hanrahan_4652251.cfm

      Hanrahan and his students developed Brook, a language for GPUs that eventually led to NVIDIA’s CUDA. The prevalence and variety of shading languages ultimately required the GPU hardware designers to develop more flexible architectures. These architectures, in turn, allowed the GPUs to be used in a variety of computing contexts, including running algorithms for high performance computing applications, and training machine learning algorithms on massive datasets for artificial intelligence applications.
    
    Papers: https://terralang.org/publications.html
    • brabel 10 hours ago
      "I am here to warn you, traveler, that Terra sits on a throne of lies. I was foolish. I was taken in by their audacious claims and fake jewels. It is only when I finally sat down to dine with them that I realized I was surrounded by nothing but cheap plastic and slightly burnt toast."

      Source: https://erikmcclure.com/blog/a-rant-on-terra/

    • emmanueloga_ 12 hours ago
      Both Terra and Extempore [1] implement the same basic idea, which is to provide a way to generate performant machine code on the fly while keeping a language surface that feels like you are working with a dynamic language (Lua and Scheme, respectively). I think Scopes [2] could be mentioned here too. The concept is really cool and different from JIT compiling in that the developer is given the tools to generate machine code, as opposed to some "magic" JIT compilation happening opaquely behind the scenes.

      That being said, I don't think either of these languages is practical for real world projects for several reasons: lack of (or low quality) documentation, bus factor (lack of developers), tiny communities, lack of tools, lack of libraries, etc etc etc.

      --

      1: https://extemporelang.github.io/

      2: https://sr.ht/~duangle/scopes/

      • transpute 11 hours ago
        > I don't think either of these languages is practical for real world projects

        Maybe not yet, but prior work by this research team eventually led to CUDA. Terra may be useful as a productivity DSL for some high-performance computations, e.g. those listed in their papers:

          our DSL for stencil computations runs 2.3x faster than hand-written C
          our serialization library is 11 times faster than Kryo
          our dynamic assembler is 3–20 times faster than Google’s Chrome assembler
        
        2021 HN thread, https://news.ycombinator.com/item?id=27334065

        > Terra is a workhorse and it gets the job done.. Having first-class code generation capabilities has been really nice, especially with built-in support for things like parsing C header files, vectorization, and CUDA code generation. The language is stable and generally doesn't have many surprises. Unlike, say, C++ or Rust, there aren't as many corners to hide odd behaviors.

    • trott 19 hours ago
      > Terra programs use the same LLVM backend that Apple uses for its C compilers.

      Can it use anything else (as an option), e.g. Lua? That would be useful during development/debugging thanks to faster iteration and memory safety.

      • transpute 13 hours ago
        LLVM would not be easily replaced, https://chriscummins.cc/2019/llvm-cost/

        > 1.2k developers have produced a 6.9M line code base with an estimated price tag of $530M.

        • ajb 11 hours ago
          Although it's true that replacing LLVM like-for-like would indeed be expensive, I feel like you missed GP's point. To make it easier to debug, what you want is to replace LLVM with something simple, like a plain interpreter. And writing one (or adapting one) would be nowhere near that cost.
          • transpute 11 hours ago
            Any precedent example for an LLVM-hosted language? There are benefits from being on the same toolchain as popular languages.
            • dvtkrlbs 7 hours ago
              Rust is experimenting with a codegen backend called Cranelift, which powers Wasmtime. The plan is to use it for debug builds. There is also a backend implementation in the works using the GCC JIT, and a .NET backend as well.
              • brabel 43 minutes ago
                Zig is actually trying to get rid of LLVM for default builds, though they intend to keep it as one of the "backends" for the foreseeable future.

                The problem is that LLVM is very slow (as it applies lots of optimisations) and heavy, and that makes a compiler slow even in debug mode.

                D has 3 backends: LLVM is used by the LDC compiler; then there's DMD, which is the official compiler (and the frontend for all three); and GDC, which uses the GCC backend. D code typically compiles quite a bit faster on DMD, but the code is more highly optimised with LLVM or GCC, of course. It's a great tradeoff, especially since all compilers use the DMD frontend, so it's almost guaranteed that there are no differences in behaviour between them.

        • 1oooqooq 3 hours ago
          That's a fallacy that was ignored when LLVM was started, despite the already-existing GCC.
    • almostgotcaught 9 hours ago
      y'all realize that this thing (terra) was a phd project that is (for a long time) fully abandoned by the guy that did the phd (zach) right?

      > Terra programs use the same LLVM backend that Apple uses for its C compilers

      lol is this a flex? or is it just dated? everyone uses LLVM for all of their compilers.

      • transpute 1 hour ago
        Fortunately, open-source projects can have more than one contributor. 2024 commit log includes:

          Bump CI to macOS 13
          Add AVX512 support
          Update LuaJIT
          Document AMD, Intel, Nvidia GPU support
          Add support for LLVM18
          Support SPIR-V code generation
  • opticfluorine 18 hours ago
    Having just gone through the exercise of integrating Lua with a custom game engine over the past few weeks, I have to echo how clean the integration with other languages is.

    It's also worth noting that the interface is clean in such a way that it is straightforward to automatically generate bindings. In my case, I used a handful of Roslyn Incremental Source Generators to automatically generate bindings between C# and Lua that matched my overall architecture. It was not at all difficult because of the way the interface is designed. The Lua stack together with its dynamic typing and "tables" made it very easy to generate marshallers for arbitrary data classes between C# and Lua.

    That said, there are plenty of valid criticisms of the language itself (sorry, not to nitpick, but I am really not a fan of one-based indexing). I'm thinking about designing an embedded scripting language that addresses some of these issues while having a similar interface for integration... Would make for a fun side project one of these days.

    • ADeerAppeared 17 hours ago
      > sorry, not to nitpick, but I am really not a fan of one-based indexing

      It is very funny how this is just the one sole criticism that always gets brought up. Not that other problems don't exist, but they're rarely talked about.

      Lua's strength as a language is that it does a lot quite well in ways that aren't noticeable. But when you compare it to the competition, they become quite obvious.

      E.g. type coercion. Complete shitshow in lots of languages. Nightmare in JavaScript. In Lua? It's quite elegant, but most interestingly, effortlessly elegant. Very little of the rest of the language had to change to accommodate the improvement. (Exercise for the reader: spot the one change that completely fixes string<->number conversion.)

      Makes Brendan Eich look like a total clown in comparison.
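      (The change being hinted at is presumably Lua's dedicated `..` concatenation operator: since `+` always means arithmetic and `..` always means strings, `"10" + 1` can safely coerce to `11` with no JS-style ambiguity. A sketch of the two coercion policies in Python, purely illustrative:)

```python
# JS-style: '+' is overloaded, so one stray string flips the meaning.
def js_plus(a, b):
    if isinstance(a, str) or isinstance(b, str):
        return str(a) + str(b)       # concatenation wins
    return a + b

# Lua-style: separate operators, each with one unambiguous coercion.
def lua_add(a, b):
    return float(a) + float(b)       # '+' is always arithmetic

def lua_concat(a, b):
    return f"{a}{b}"                 # '..' is always string building

assert js_plus("10", 1) == "101"     # the classic JS surprise
assert lua_add("10", 1) == 11.0      # Lua: "10" + 1 is 11
assert lua_concat(10, 1) == "101"    # Lua: 10 .. 1 is "101"
```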

      • taberiand 17 hours ago
        To be fair, having to work with 1 based indexing when you're used to 0 would be a frustrating source of off-by-one errors and extra cognitive load
        • berkut 17 hours ago
          As someone who's used Lua a lot as an embedded language in the VFX industry (The Games industry wasn't the only one that used it for that!), and had to deal with wrapping C++ and Python APIs with Lua (and vice-versa at times!), this is indeed very annoying, especially when tracing through callstacks to work out what's going on.

          Eventually you end up in a place where it's beneficial to have converter functions that show up in the call stack frames so that you can keep track of whether the index is in the right "coordinate index system" (for lack of a better term) for the right language.

          • fp64 1 hour ago
            Oh that’s super interesting, where in the VFX industry is Lua common? I typically deal with Python and maybe Tcl (I do mostly Nuke and pipeline integrations), and I can’t think of a tool that is scripted in Lua. But I’ve never worked with Vega or Shake or whatever it is/was called.
        • chengiz 17 hours ago
          It absolutely is. For a language whose biggest selling factor is embeddability with C/C++, that decision (and I'm being polite) is a headscratcher (along with the other similar source of errors: 0 evaluating to true).
          • usrusr 16 hours ago
            It's the perfect distraction: once you start accepting one-based, everything else that might be not quite to your liking isn't worth talking about. I could easily imagine an alternative timeline where lua was zero-based, but never reached critical mass.
            • wruza 16 hours ago
              Absolutely so. It’s just one obvious thing from a large set of issues with it.

              One can read through “mystdlib” part of any meaningful Lua-based project to see it. Things you’ll likely find there are: NIL, poor man’s classes, __newindex proxying wrapper, strict(), empty dict literal for json, “fixed” iteration protocols.

              You don’t even have to walk too far, it’s all right there in neovim: https://neovim.io/doc/user/lua.html

              • usrusr 4 hours ago
                Perhaps that's the secret to lua's success: they got the basics so wrong that they can't fall into the trap of chasing the dream of becoming a better language, focusing on becoming a better implementation instead. Or perhaps even more important: not changing at all when it's obviously good enough for the niche it dominates.
              • kragen 15 hours ago
                This is very valuable! Thank you!
            • kzrdude 4 hours ago
              That's an interesting notion but I think that Lua had no competition - it was almost unique in how easy it was to integrate vs the power it gives. Its popularity was inevitable that way.
          • Mikhail_Edoshin 11 hours ago
            The language actually started as a data entry notation for scientists.
        • bigstrat2003 10 hours ago
          I don't really blame Lua for that, though. 1-based indexing comes naturally to humans. It's the other languages which are backwards. I get why other languages are backwards from human intuition, I do. But if something has to be declared the bad guy, imo it's the paradigm which is at odds with human intuition which deserves that label, not the paradigm which fits us well.
          • xigoi 5 hours ago
            1-based indexing is not any more natural than 0-based, it’s just that humans started indexing before the number 0 was conceptualized.
          • saurik 10 hours ago
            https://www.cs.utexas.edu/~EWD/transcriptions/EWD08xx/EWD831...

            Why numbering should start at zero. -- Dijkstra

            • Certhas 8 hours ago
              Argument by authority.

              To me, 1-based indexing is natural if you stop pretending that arrays are pointers + index arithmetic. Especially with slicing syntax.

              It's one of the things that irked me when switching to Julia from Python but which became just obviously better after I made the switch.

              E.g. in Julia `1:3` represents the numbers 1 to 3. `A[1]` is the first element of the array, `A[1:3]` is a slice containing the first to third elements. `A[1:3]` and `A[4:end]` partition the array. (As an aside: `for i in 1:3` gives the numbers 1, 2, 3.)

              The same sentence in python:

              `1:3` doesn't make sense on its own. `A[0]` is the first element of the array. `A[0:3]` gives the elements `A[0], A[1]` and `A[2]`. `A[0:3]` and `A[3:]` slice the array.

              For Python, which follows Dijkstra for its Slice delimiters, I need to draw a picture for beginners (I feel like the numpy documentation used to have this picture, but not anymore). The Julia option requires you to sometimes type an extra +1 but it's a meaningful +1 ("start at the next element") and even beginners never get this wrong.
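              The two conventions can be put side by side (Python only, since that's what runs here; the Julia behaviour is simulated with a hypothetical helper):

```python
A = ["a", "b", "c", "d", "e"]

# Python (0-based, half-open): A[0:3] and A[3:] partition the list.
assert A[0:3] + A[3:] == A
assert A[0] == "a"                 # first element is index 0

# Julia-style (1-based, inclusive) simulated with a helper:
# A[1:3] means elements 1 through 3, so A[1:3] and A[4:end] partition.
def jl(lst, lo, hi):
    return lst[lo - 1:hi]          # translate inclusive 1-based to Python

assert jl(A, 1, 3) + jl(A, 4, len(A)) == A
assert jl(A, 1, 1) == ["a"]        # first element is index 1
```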

              That said, it seems to me that for Lua, with its focus on embedding in the C world, 0-based indexing makes more sense.

            • MattJ100 9 hours ago
              I admire Dijkstra for many things, but this has always been a weak argument to me. To quote:

              "when starting with subscript 1, the subscript range 1 ≤ i < N+1; starting with 0, however, gives the nicer range 0 ≤ i < N"

              So it's "nicer", ok! Lua has a numeric for..loop, which doesn't require this kind of range syntax. Looping is x,y,step where x and y are inclusive in the range, i.e. Dijkstra's option (b). Dijkstra doesn't like this because iterating the empty set is awkward. But it's far more natural (if you aren't already used to languages from the 0-indexed lineage) to simply specify the lower and upper bounds of your search.

              I actually work a lot with Lua, all the time, alongside other 0-indexed languages such as C and JS. I believe 0 makes sense in C, where arrays are pointers and the subscript is actually an offset. That still doesn't make the 1st item the 0th item.

              Between this, and the fact that, regardless of language, I find myself having to add or subtract 1 frequently in different scenarios, I think it's less of a deal than people make it out to be.
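              Lua's inclusive numeric for..loop maps onto a half-open range like Python's via a small (hypothetical) helper:

```python
def lua_for(x, y, step=1):
    # Lua: for i = x, y, step do ... end -- both bounds are inclusive,
    # which is Dijkstra's option (b).
    return list(range(x, y + (1 if step > 0 else -1), step))

assert lua_for(1, 5) == [1, 2, 3, 4, 5]       # bounds exactly as written
assert lua_for(10, 2, -2) == [10, 8, 6, 4, 2]
assert lua_for(3, 2) == []                    # the awkward empty range
```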

              • saurik 3 hours ago
                In any language, arrays are inherently regions of memory and indexes are -- whether they start at 0 or 1 -- offsets into that region. When you implement more complicated algorithms in any language, whether or not it has pointers or how arrays are syntactically manipulated, you start having to do mathematical operations on both indexes and on ranges of index, and it feels really important to make these situations easier.

                If you then even consider the simple case of nested arrays, I think it becomes really difficult to defend 1-based indexing as being cognitively easier to manipulate, as the unit of "index" doesn't naturally map to a counting number like that... if you use 0-based indexes, all of the math is simple, whereas with 1-based you have to rebalance your 1s depending on "how many" indexes your compound unit now represents.
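                The rebalancing shows up as soon as you flatten a 2-D index (Python, illustrative):

```python
COLS = 4  # a 3x4 grid

# 0-based: the flat offset is a plain linear combination of indexes.
def flat0(row, col):
    return row * COLS + col

# 1-based: every term needs a -1, and a +1 comes back at the end.
def flat1(row, col):
    return (row - 1) * COLS + (col - 1) + 1

assert flat0(0, 0) == 0            # first cell
assert flat0(2, 3) == 11           # last cell
assert flat1(1, 1) == 1            # first cell
assert flat1(3, 4) == 12           # last cell
```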

              • Certhas 8 hours ago
                And the reason to dismiss c) and d) is so that the difference between the delimiters is the length. That's not exactly profound either.

                If the word for word same argument was made by an anonymous blogger no one would even consider citing this as a definitive argument that ends the discussion.

        • opticfluorine 17 hours ago
          Especially when one of the language's main purposes is to be embedded in applications written in other languages (which are predominantly zero based) - and so you tend to have a lot of back-and-forth integration between these two styles that can get confusing. Even from the C API side, for example, the Lua stack is one-based but addressed exclusively from the host language which is likely zero-based.
      • sitkack 17 hours ago
        Don't forget that not equals is ~=, the horror.

        The real gripes should be globals by default and ... nothing. Lua is wonderful.

        • gerdesj 16 hours ago
          "Don't forget that not equals is ~=, the horror."

          I get you are taking the piss, but ~= is just as logical a symbol for "not equals" as != is, if you've been exposed to some math(s). And ... well, ! means factorial, doesn't it?

          Syntax is syntax and so is vocabulary! In the end you copy and paste off of Stack Exchange and all is golden 8)

          • astrobe_ 4 hours ago
            For a language apparently inspired by Wirth, one would have expected <> (less-than-or-greater-than). But the real horror, to me, is Windows not letting one disable the so~called "dead keys" easily.
          • bingo3131 10 hours ago
            ! is commonly used as the unary not operator, so "a != b" makes sense as a shortcut for "!(a == b)". a not equals b.
            • xigoi5 hours ago
              But in Lua, the unary not is written as “not”.
          • ctippett 12 hours ago
            I'm more familiar with CSS than I am with Lua. The syntax for the former has a very different meaning[1].

              [attr~=value]
              Represents elements with an attribute name of attr whose value is a whitespace-separated list of words, one of which is exactly value.
            
            
            [1] https://developer.mozilla.org/en-US/docs/Web/CSS/Attribute_s...
        • kps 1 hour ago
          Yeah, 36 years of Unicode and it's still not ‘≠’.
      • opticfluorine 17 hours ago
        It is funny, isn't it? I always wonder how the language would be perceived had they gone with zero based indexing from the start.

        I'm a big fan of Lua, including for the reasons you mention. I suspect the reason this one thing is always brought up is twofold: it's easy to notice, and it's very rare these days outside of Lua (if you consider VB.NET to be a legacy language, anyway). Other criticisms take more effort to communicate, and you can throw a rock and hit ten other languages with the same or similar issues.

      • saghm 16 hours ago
        Honestly, it might almost act as a "honeypot" to give people a convenient target to complain about, which makes it easier for the rest of the language to get viewed as a whole rather than nitpicked. Sometimes I think people like to have at least one negative thing to say about something new they learn, whether it's to show that they understand enough to be able to find something or it's to be "cool" enough not to just like everything.
        • chii 15 hours ago
          that would be a galaxy brain move on the part of lua.
          • saghm 13 hours ago
            To be clear, I'm not claiming that I think this was the original intent behind implementing array indexes like this; I'm just theorizing that this might explain why it ends up being such a fixation when it comes to discussing the downsides of Lua.
      • krapp 14 hours ago
        >Makes Brendan Eich look like a total clown in comparison.

        To be fair, Brendan Eich was making a scripting language for the 90's web. It isn't his fault Silicon Valley decided that language needed to become the Ur-language to replace all application development in the future.

        • yoz 12 hours ago
          Most of the blame should go to Netscape management. They didn't give Eich much time, then burst in before he was done and made him copy a bunch of things from Java. (The new language, codenamed "Mocha" internally, was first publicly announced as "LiveScript", and then Sun threw a bunch of money at Netscape.)

          IIRC, Eich was quite influenced by Python's design. I wish he'd just used Lua - would likely have saved a lot of pain. (Although, all that said, I have no idea what Lua looked like in 1994, and how much of its design has changed since then.)

          • BrendanEich 9 hours ago
            https://news.ycombinator.com/item?id=1905155

            If you don't know what Lua was like then, don't wish that I'd "just used Lua".

            Other issues include Netscape target system support, "make it look like Java" orders from above, without which it wouldn't have happened, and more.

            • kragen 5 hours ago
              Oh hi Yoz! LTNS! Hi Brendan!

              It sounds like you're saying Yoz got the sequence of events wrong, and that MILLJ was a necessary part of getting scripting in the browser? I sort of had the impression that the reason they hired you in the first place was that they wanted scripting in the browser, but I wasn't there.

              I don't think Lua was designed to enforce a security boundary between the user and the programmer, which was a pretty unusual requirement, and very tricky to retrofit. However, contrary to what you say in that comment, I don't think Lua's target system support or evolutionary path would have been a problem. The Lua runtime wasn't (and isn't) OS-dependent, and it didn't evolve rapidly.

              But finding that out would have taken time, and time was extremely precious right then. Also, Lua wasn't open-source yet. (See https://compilers.iecc.com/comparch/article/94-07-051.) And it didn't look like Java. So Lua had two fatal flaws, even apart from the opportunity cost of digging into it to see if it was suitable. Three if you count the security role thing.

        • ADeerAppeared 3 hours ago
          > To be fair, Brendan Eich was making a scripting language for the 90's web.

          He was, and he doesn't deserve the full blame for being bad at designing a language when that wasn't his prior job or field of specialization.

          But Lua is older so there's this element of "it didn't need to be this bad, he just fucked up" (And Eich being a jerk makes it amusing to pour some salt on that wound. Everyone understands it's not entirely serious.)

        • bigstrat2003 10 hours ago
          > It isn't his fault Silicon Valley decided that language needed to become the Ur-language to replace all application development in the future.

          Which remains one of the most baffling decisions of all time, even to this day. Javascript is unpleasant to work with in the browser, the place it was designed for. It is utterly beyond me why anyone would go out of their way to use it in contexts where there are countless better languages available for the job. At least in the browser you pretty much have to use JS, so there's a good reason to tolerate it. Not so outside of the browser.

        • BrendanEich 9 hours ago
          "Silicon Valley" is not an actor (human or organization of humans) that decided any such thing. This is like saying a virus decides to infect a host. JS got on first, and that meant it stuck. After getting on first, switching costs and sunk costs (fallacy or not) kept it going.

          The pressure to evolve JS in a more fair-play standards setting rose and fell as browser competition rose and fell, because browser vendors compete for developers as lead users and promoters. Before competition came back, a leading or upstart browser could and did innovate ahead of the last JS standard. IE did this with DHTML mostly outside the core language, which MS helped standardize at the same time. I did it in Mozilla's engine in the late '90s, implementing things that made it into ES3, ES5, and ES6 (Array extras, getters and setters, more).

          But the evolutionary regime everyone operated in didn't "decide" anything. There was and is no "Silicon Valley" entity calling such shots.

          • ADeerAppeared 3 hours ago
            > "Silicon Valley" is not an actor (human or organization of humans) that decided any such thing.

            Oh come on, you understand full well that they're referring to the wider SV business/software development "ecosystem".

            Which is absolutely to blame for javascript becoming the default language for full-stack development, and the resulting JS-ecosystem being a dysfunctional shitshow.

            Most of this new JS-ecosystem was built by venture capital startups & tech giants obsessed with deploying quickly, with near-total disregard for actually building something robustly functional and sustainable.

            e.g. React as a framework does not make sense in the real world. It is simply too slow on the median device.

            It does, however, make sense in the world of the Venture Capital startup. Where you don't need users to be able to actually use your app/website well. You only need that app/website to exist ASAP so you can collect the next round of investment.

        • peutetre 5 hours ago
          I don't think JavaScript will replace all application development in the future. WebAssembly will displace JavaScript. With WebAssembly you can use whatever language you like and achieve higher performance than JavaScript.
        • thaumasiotes 12 hours ago
          Wasn't the language he was making for the web Scheme?
          • hollerith 12 hours ago
            No, Scheme was defined in 1975.
            • thaumasiotes 12 hours ago
              So? That doesn't stop Brendan Eich from putting it in a web browser 20 years later.
              • hollerith 12 hours ago
                No, it does not. I see I misunderstood your q.
    • m463 11 hours ago
      > I'm thinking about designing an embedded scripting language that addresses some of these issues...

      https://xkcd.com/927/

      ;)

    • zogomoox 18 hours ago
      if you fix the one based indexing you should call yours KT@ or Kt`
  • Levitating 22 hours ago
    I can't say I appreciate Lua any more after reading this. It only reinforced my annoyances with it.

    It's been a while since I wrote any Lua, but I didn't think it was intuitive at all. Why are arrays tables, why are they nil-terminated, and why do they start at 1?

    I don't think it's underrated either. It's very popular for writing scripts in (older) game engines.

    • jillesvangurp 5 hours ago
      > It's very popular for writing scripts in (older) game engines.

      Exactly, it seems popular in places where the alternative is even less usable (e.g. writing your own C++ plugin). For that it's fine.

      For example, there is a Lua plugin for X-Plane and it's great. I've done some minor things with it even. But that didn't make me a fan of the language. It's fine for simple things, but a bit limited.

      The x-plane API is pretty low level and written in C++. So something that sits on top and is a bit more accessible is great. There are a lot of lua scripts available that you can run in there that are pretty nice. But the more complicated ones get pretty hairy quickly. Makes you long for something more structured.

      I don't have the time, but I did consider working on adding a websocket API at some point to make it easier to integrate with from e.g. a web page or some kind of standalone process. I think they are now planning to add that feature themselves so there's no need to work on this anymore. And they have a low level UDP API currently that exposes all sorts of things. But a websocket API with some good client libraries for different languages would make a lot of sense.

    • int_19h 16 hours ago
      > Why are arrays tables

      For the same reason why (standard) arrays are associative in PHP and JS, as well - because, from a certain perspective, a regular array is just an associated array with consecutive integers as keys, so if you're trying to keep things simple, it seems like a straightforward unification.

      > why are they nil terminated

      It follows from them being associative arrays, and from Lua's pervasive use of "nil" to mean "this thing is missing".

      > why do they start at 1

      Because humans generally start counting from 1.

      • cxr21 minutes ago
        It is a misconception that a JS array is "just an associated [sic] array with consecutive integers as keys". Even the earlier (most readable) editions of the ECMA spec have a whole set of carve-outs that belies the claim.
      • dahart16 hours ago
        > Because humans generally start counting from 1.

        I’m not the parent, but for me the question is not literally why they chose 1-based indexing. That’s easy to answer. The question is really: why haven’t they realized by now that the disadvantages of 1-based indexing far outweigh the benefits when it comes to programming, and changed it? Backward compatibility is a valid reason, but lots of other people have tried 1-based arrays and reverted to 0-based once it became clear that 1-based arrays cause more problems than they solve. Even humans who count starting from one will soon want 0-based indexes the moment they need to add two array indexes together; we don’t want to have to start from 2 in that case, right? Early versions of the famous Numerical Recipes in C used 1-based arrays, but removed them, and got slightly better. :P
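        The index-arithmetic point can be made concrete with a hypothetical row-major grid (`idx0`/`idx1` are made-up helpers, not from any library):

        ```javascript
        const cols = 4;
        // 0-based: the offset of (row i, col j) is a plain sum of scaled indexes
        const idx0 = (i, j) => i * cols + j;
        // 1-based: the same position needs a compensating -1 term
        const idx1 = (i, j) => (i - 1) * cols + j;
        console.log(idx0(2, 3)); // 11
        console.log(idx1(3, 4)); // 12, i.e. idx0(2, 3) shifted by one
        ```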

        • leephillips15 hours ago
          Yes, that’s surely why Fortran never found any widespread adoption and remained an obscure, niche language.
    • kristopolous22 hours ago
      a bunch of that comes from awk.
    • matheusmoreira22 hours ago
      > Why are arrays tables

      I might be able to provide at least some insight into that.

      One of the things I found while implementing my language was that the ordering of elements in a hash table is essentially random. Literally random if you mix in some randomness into the hash function's initial state in order to harden it against collision attacks.

      Many languages guarantee insertion ordering. It's a useful property to have, makes testing easy and deterministic and it's intuitive for users. A simple way to implement that is to put a perfectly normal array of values into the hash table, and then have the buckets which the keys hash into contain indexes into that array. That way the indexes of values remain stable no matter what the keys hash to.

      So a sufficiently advanced hash table will have an array of values inside it. I suppose some language designers took one look at that situation and decided to just merge both types already by using hash tables for everything. For example, PHP also did that. Associative arrays, they call it.
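      The layout described above can be sketched in a few lines (a hedged sketch, not any particular engine's implementation; `OrderedDict` is a made-up name, and a plain object stands in for the hash buckets):

      ```javascript
      // Insertion-ordered map: the "buckets" map keys to positions in a
      // dense entries array, so iteration order never depends on hashing.
      class OrderedDict {
        constructor() {
          this.index = Object.create(null); // stand-in for the hash buckets
          this.entries = [];                // dense [key, value] pairs
        }
        set(key, value) {
          const pos = this.index[key];
          if (pos !== undefined) {
            this.entries[pos][1] = value;   // update in place, order unchanged
          } else {
            this.index[key] = this.entries.length;
            this.entries.push([key, value]);
          }
        }
        get(key) {
          const pos = this.index[key];
          return pos === undefined ? undefined : this.entries[pos][1];
        }
        keys() {
          return this.entries.map(([k]) => k); // iterate the dense array only
        }
      }
      ```

      Deletion is where real implementations earn their keep (tombstones, periodic compaction); the sketch skips it.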

      • kragen18 hours ago
        CPython's hash table (dict) evolved to have an array of values indexed by a hash table of indices for performance reasons, but later decided to guarantee the insertion ordering that this permitted for iteration.

        "Associative array" is just the term used for hash tables (dicts) in Awk, bash, Tcl, and, most relevantly for PHP, Perl4. It doesn't specifically mean that the language lacks a separate container type for small-integer-indexed arrays. It just means that that particular container type can be indexed by something other than small integers, usually strings.

        • wruza16 hours ago
          Yes, this name is basically a reflection of the fact that a hash table is an array, indexes of which get calculated by `hashof(key) % arr.length`.

          It doesn’t mean HTs are the arrays which should be used to store seqint-indexed values.

          • kragen16 hours ago
            No, that's an implementation detail which isn't actually true of the associative arrays in all of these languages. "Associative array" is the name for the interface, not the implementation. In theory it could be a hash trie or something, although I'm not aware of any implementations of any of these languages that uses anything but a hash table (which is, yes, fundamentally dependent on using an array).
        • matheusmoreira14 hours ago
          > CPython's hash table (dict) evolved to have an array of values indexed by a hash table of indices for performance reasons

          In case anyone is curious about it:

          https://mail.python.org/pipermail/python-dev/2012-December/1...

          > In addition to space savings, the new memory layout makes iteration faster.

          > Currently, keys(), values(), and items() loop over the sparse table, skipping over free slots in the hash table.

          > Now, keys/values/items can loop directly over the dense table, using fewer memory accesses.

          The "dense table" is the array of values. The "sparse table" is the actual hash table; its slots hold only indexes into the values array.

          I thought the insertion ordering was implemented at around the same time as a result of it. The data structure ensures it pretty much for free, seemed like the obvious next step. It appears I was mistaken about that.

          • kragen4 hours ago
            They did implement insertion ordering at that time, but they didn't want to commit to adding it to the official language semantics until having enough experience to know whether they wanted to back out the change to the implementation which provided that guarantee for free.
      • wruza17 hours ago
        You described how every language without hashtable-as-an-object works. The only difference is when you address fields.

        But I fail to see the logic behind seeing this and merging the array/table ideas. Feels like one of those irrational decisions you make in a dream.

        And Lua doesn’t use a hash table to store arrays. Its tables have two different parts (hash and indexed array) with foggy distribution logic, which surfaces as the “# vs ipairs” issue. This merge is purely synthetic and doesn’t reflect any underlying implementation idea.

        • unscaled15 hours ago
          Merging the concepts of arrays and maps was very popular in the 1980s and 1990s. I can count at least Awk, Tcl, Lua, JavaScript and PHP as languages that went with this approach.

          Looking from today's perspective, I also fail to see the logic behind it: separating arrays and dictionaries does not make your implementation more complicated than having a single backing data structure for both and taking care of all the special cases that arise from such a monstrosity.

          The best explanation I can think of is that this was just a fad nobody questioned too much. Back in the 1980s and 1990s, there was a strong distinction between "scripting languages" and general-purpose dynamic languages (like Lisp and Smalltalk). Scripting languages were supposed to be used to write small scripts to automate stuff. Type-safety was not something that any mainstream language considered back then (it was a thing in niche languages like Ada, SML and Eiffel), and speed was not a concern for scripting languages.

          A common philosophy for scripting languages (though I've never seen it clearly stated) seems to have been to "be liberal and try to work with whatever the programmer does".

          So you shouldn't require variables to be declared in advance (that's a lot of hassle!) or throw an error if a function is called with the wrong number of arguments.

          And what if the programmer tries to add an integer and a number written as string? Just coerce them. Or better yet, make everything a string like Tcl does. Or at the very least, don't be a spoilsport and have separate types for integers and floating point numbers!

          Ok, so what if the programmer tries to use a string index? We cannot coerce non-numeric strings! Well, why don't we just unify arrays and dictionaries like we've just unified all our number types? Now you don't have to worry about invalid indexes since all indexes are converted to strings! And if the programmers want to use 0-based, 1-based, 2-based or 42-based indexes they can do whatever they want! They can even have non-contiguous arrays or mix arrays with associative dictionaries. Total freedom.

          This thought process is completely imaginary. I suspect not a lot of thought was given to this type of flexibility at all back in the day. It was just a widely accepted assumption that scripting languages should be flexible, free and relaxed. And merging arrays and dictionaries (or "all arrays are associative arrays" as it was often called back then) was just one more idea that naturally fit into the zeitgeist, like type coercion and "assignment to an unknown variable sets a global variable by default".

          So tldr: it was a fad.

          • BoingBoomTschak4 hours ago
            > Tcl

            Not to my understanding. What is called an "array" in the language is indeed associative, but lists (heterogeneous vectors, like Python's) have always been there and distinct.

            But otherwise, insightful post. Awk is also the first thing that came to mind, with its funky arrays and "undefined variables are initialized on first use to the empty string or 0 depending on the context".

    • selimthegrim22 hours ago
      The system software for newer TI graphing calculators is written in Lua and often emulated to Z80.
  • fsmv19 hours ago
    I built a chatbot with Lua plugin commands and had an awful experience trying to incorporate it into my C++ program. I really don't understand why people say it's easy to embed, because you have to do a ton of stack manipulation to do anything. Not only that, but the language itself is cumbersome and frustrating, and does nothing at all to help you when you make a mistake.

    I will never use lua again and avoid any project that uses it.

    • bogwog18 hours ago
      I am in the "Lua sucks" camp. I also found it weird how clunky it is to embed, but my biggest gripe is with the language itself. It just sucks, as it's barely a programming language at all. Every Lua project ends up reinventing its own class system or other language features to get anything done.

      I've also never seen a clean (non-trivial) Lua codebase. They're all spaghetti messes and difficult to follow. In fact, some of the worst, least maintainable, and hardest-to-read code I've seen has been in Lua.

      For configuration (what it was originally designed for) or for very small, non-critical stuff, then it can be a good choice.

      With that said, I know some people are very productive with it, and have made impressive things in Lua. It's just not for me.

      • garbagelang14 hours ago
        > It just sucks, as it's barely a programming language at all

        So very true.

        This entire comment accurately depicts what it’s like to actually use Lua.

        The worst, most horrific codebase I've ever worked on was in Lua, for some godawful reason.

      • zaik9 hours ago
        I enjoyed writing a small module for the Prosody XMPP server, which is written in Lua. Every time I had to look into the internals, I found what I was looking for pretty easily.
      • kragen17 hours ago
        > It just sucks, as it's barely a programming language at all.

        I remember when I thought like this when I was first learning C++ and thought it was awesome because of all the features built into the language. Since then, I've learned that things that are "barely a programming language at all" like Lua—where you can easily learn the whole language and then write, if necessary, reflective object-oriented effortlessly parametrically polymorphic code—are actually the best languages of all. (Such extreme liberalism does require a compensating amount of discipline, as you say.)

        Lua has its flaws (implicit nils, the nil/# interaction, default global, 1-based indexing, etc.) but minimalism is one of its major virtues.

        • wruza17 hours ago
          Then you get hold of a language that doesn’t have all this mess as a foundation and it feels like you stopped self-suffocating.

          > you can easily learn the whole language and then write, if necessary, reflective object-oriented effortlessly parametrically polymorphic code

          This is all you can do in these languages. Anything above it will share the fragility and ad-hocness of this meaningless language para-olympics.

          • kragen16 hours ago
            Nope, that's completely wrong, to the point that you cannot have possibly thought it was correct. You can avoid using reflection, object orientation, and parametric polymorphism in Lua, just as you can in almost any other language, and stick to simple imperative code. And most of the time you should.
            • wruza16 hours ago
              I think you misunderstood my second paragraph. What I’m saying is that in order to do anything complex in it, you’ll have to reimplement important parts that are lacking, using all these meta/parametric things in your “mylualib”, and all this mud will drown you when you’re most vulnerable.

              Toying with it with all the free time on one’s hands is okay.

              Avoiding useful techniques is an option, but doing it when the world of saner alternatives exists feels like self-sabotaging.

              • kragen16 hours ago
                Hmm, that makes a little more sense than what I thought you were saying. I still don't think it's correct, though. Lua has those facilities already; you don't have to implement them yourself. My point is that it isn't like programming in assembly or C where you either implement them yourself or do without. When OO or reflection is the saner alternative, in Lua, you can just use it.
                • wruza15 hours ago
                  I see your point and have even shared it, but my experience with real-business Lua was miserable and actively laughed at. Which was, I must admit now, absolutely deserved.
                  • kragen15 hours ago
                    Condolences! Fortunately, my own Lua experience has entirely been real-fun Lua instead of real-business Lua.
    • Etheryte18 hours ago
      My experience has been similar, even if not as drastic. Lua is fine for small extensions and such (e.g. Hammerspoon), but the lack of any kind of reasonable standard library really rubs me the wrong way. Sure, I can write a function to log an array, but why can't I just log an array like a decent human being?
    • kragen18 hours ago
      What language did you find was easier to embed, and how does it handle the lifetime management that Lua's embedding API handles with its stack?

      The language itself may be a matter of taste. I prefer languages which, "in the face of ambiguity, refuse the temptation to guess," but certainly I've spent enough time using Perl, bash, JS, and BASIC to be able to deal with Lua's relatively mild level of DWIMminess.

      • le-mark17 hours ago
        I would use QuickJS today for embedding over Lua. Everyone knows JS nowadays, and QuickJS is really nice. The advantage Lua has is library support via LuaRocks. Maybe someone can comment on the QuickJS ecosystem?
        • dchest11 hours ago
          I have experience of embedding and modifying both QuickJS and Lua. QuickJS is not bad, but Lua is a lot easier and has fewer bugs. JavaScript is a very complicated language under the surface; I think very few people, if any, know and remember all its quirks.
        • ga2mer9 hours ago
          Amazfit's Zepp OS uses quickjs to run 3rd party watchfaces and apps in their smartwatch

          I was surprised to find out I'd been wearing javascript on my arm all this time

        • xigoi5 hours ago
          While QuickJS is nice, JavaScript as a language is much uglier and more complicated than Lua.
        • kragen16 hours ago
          It sounds like you're saying quickjs is what you'd try if you wanted to embed a language, but you haven't tried embedding quickjs yet?
        • feznyng13 hours ago
          I think there’s a fork called quickjs-ng that’s actively maintained on GH. Should be able to use it as a lib maybe with some externing.

          I’ve also had luck using it in a wasm runtime thanks to quickjs-emscripten. Good if you need sandboxing sans full process isolation.

          • kragen4 hours ago
            Thank you for sharing your experience! Wasm provides full process isolation, doesn't it? Do you mean that QuickJS provides cheaper sandboxing than wasm?
        • garbagelang14 hours ago
          LuaRocks is an advantage? It's full of thousands of variations of the same poorly copy-pasted code from idiotic Lua forums. All of them basically abandoned.

          LuaRocks is a wart on Lua's piss-poor existence, not an advantage. Lol.

      • int_19h17 hours ago
        Python is arguably easier since you don't need to handle non-native stacks at all; you just have a straightforward mapping of Python operations to C functions that you call with regular arguments, as usual.
        • kragen16 hours ago
          With Python's extension API, you have to carefully incref and decref Python values to ensure that you don't either leak memory or deallocate data that's still in use. That's what the Lua stack interface saves you. It's not a call stack, which seems to be what you're thinking of.
          • wruza15 hours ago
            Lifetimes and references are things to consider in Lua integration code too. See the luaL_ref() rationale and why you can’t lua_pop() a string that is in use. The fact that GC doesn’t happen instantly 99.99999% of the time doesn’t save you from thinking about it.

            Not that this was a hard topic in a language where lifetimes are the main concern and business as usual. In C/etc you can’t hold anything without a ref or some sort of a lifetime guarantee. What makes it special wrt Lua/Python?

            • kragen15 hours ago
              Yeah, I didn't mean to say you didn't have to think about lifetimes at all when embedding Lua, just that the stack, weird as it is, seems to simplify that kind of thing rather than complicating it (even if you do occasionally have to luaL_ref). But if there's a clearly better alternative, I'd like to know about it. Constantly calling Py_INCREF and Py_DECREF isn't it.

              Delayed GC seems like it might add headaches for this kind of thing rather than removing them, because your bugs may take a while to show up in testing.

              • wruza15 hours ago
                Lua stack basically works as an arena if you aren’t eager to clean it.

                I guess in C one could macro hack himself a similar arena to borrow objects from python runtime.

                In C++ it’s a matter of befriending a smart pointer with ref/unref calls. It was probably done hundreds of times in all sorts of python embedding wrappers.

    • wruza18 hours ago
      It’s easy to embed compared to e.g. python, js or perl, and iff you only push/call/pop and it’s all functions with primitive arguments. Otherwise it’s ~as hard.
  • bvrmn18 hours ago
    It's hard to use Lua for any sizeable chunk of code due to weak typing and the absence of type hints and error handling. No, it's not underrated in any way.

    It's funny how through the years I've started to appreciate statically typed languages and type hints (Python, TypeScript) as well. Up to my current stance: it's a disservice for a community to write hintless python or js code.

    P.S. Recently I've rewritten 1.5k cloc of JS into TypeScript. As usual there were dozens of cases of missed nulls, null attribute accesses or semantically questionable nulls.

    • DanielHB17 hours ago
      It is because type inference has become a standard feature in most languages so you don't have to type AbstractBeanCounterSingletonFactoryFactory all the time.

      Also the duck typing crowd were rebelling against polymorphism and over-used meta-programming OOP slop, not against types per se.

      • bvrmn17 hours ago
        > Also the duck typing crowd were rebelling against polymorphism and over-used meta-programming OOP slop, not against types per se.

        My argument against types in python was literally: it's hard to model my domain with static types. And we have tests!

        I felt types would severely decrease my productivity as 10x coder. Embarrassing, right?

    • thunkle11 hours ago
      I just started using lua and have types in vscode using lua annotations. https://github.com/LuaLS/lua-language-server/wiki/Annotation...
    • rstuart413318 hours ago
      You forgot to mention variables are global by default, just like javascript.
    • liendolucas18 hours ago
      > Up to my current stance: it's a disservice for a community to write hintless python or js code.

      Strongly disagree. If you are using Python or JavaScript or any other duck-typing language, you are supposed to know what you are doing, and types are completely unnecessary. I'm amused at how the community in general praises and adopts TypeScript in so many projects. You can just write reliable JS code as it is. I have never written any hinted Python code and everything just works as expected. If you need some guarantees, then I would probably switch to some unit testing, which at least for me brings more value than using types.

      • wruza7 hours ago
        > You can just write reliable JS code as it is

        You can, but I can’t. I often miss fixing some identifier in the little clean-up phase after prototyping yet another chunk.

        > unit testing which at least for me brings more value than using types

        Not everyone writes for testing. E.g. I have large-enough AI-related scripts in TypeScript that require types to feel safe to change. But I won’t create tests for these only to find typos and editing leftovers. That would be such a waste of time.

        Typescript types are only required in “entry points” and live their own life afterwards thanks to inference. To me it was so worth it that I stopped creating js sandboxes for little experiments and made a ts project scaffolding script to spin up a new app in a few keystrokes.

        And when you want to learn about “third party” code, types are irreplaceable. Code tells absolutely nothing compared to types. A single minute in .d.ts is worth an hour in .js.

      • jchw17 hours ago
        - Documentation is great, but documentation can fall out of date. Strong types usually can't; they are checked in both directions.

        - Tests are great, but it's hard to maintain 100% coverage that checks all of what types do. It's very possible to have tests that pass that shouldn't. Types can not only find stealthily broken code, but also broken tests.

        - Type checking gives even more instantaneous feedback than almost any test runner.

        - Most projects have multiple developers. Types don't communicate everything, but they are a reliable form of communication that is guaranteed to be up-to-date even when documentation isn't, which is great combined with good function names and good parameter names.

        - Most programs need to deal with requirements that shift, either because they evolve or our understanding evolves. Refactoring code without types is hard: if you change the return value of something, how can you be sure that all of the usages are up-to-date? What about if your refactor has been rebased a number of times before merging? I've run into a real-world production incident caused by this, when a return type changed to a tuple and it only exploded in an important Celery job that didn't have a test around the specific boundary that broke.

        "If you know what you're doing" is not enough. Programs change, and they are changed by multiple people concurrently. This is the same reason why Rust continues to grow: people can say what they want, but moving errors to the left is the only way forward, because even if you're literally perfect you still have to know 100% of what's going on to never merge anything that causes a problem. (But really, people are not perfect, and never will be, so better tools will win eventually.)

        • mdaniel14 hours ago
          > - Most programs need to deal with requirements that shift, either because they evolve or our understanding evolves. Refactoring code without types is hard

          I get perverse thrills out of drawing the attention of the "pfft, types are for n00bs" crowd to the multiple authentication related CVEs in GitLab due to the "well, the parameter types are whatever they are, don't you worry about it"

          That said, I'm trying very hard to wean myself off even reading these threads, let alone contributing to them, because much like <https://news.ycombinator.com/item?id=42492508> I'm never going to convince them and they for damn sure are never going to convince me.

      • vrighter9 hours ago
        Code always has types. Even in dynamic languages. The difference is whether those types can be expressed in the code, or have to be kept completely in your head (or constantly having to re-figure out what they are).
      • int_19h16 hours ago
        Past a certain amount of lines of code, you might think you know what you're doing and types are completely unnecessary, but it's just not true.
      • blazing23417 hours ago
        I weep for the people who have to read your code when you're not around
      • bvrmn17 hours ago
        > you are supposed to know what you are doing and types are completely unnecessary

        You are human, you simply can't.

        I've literally found bugs in MY existing code by merely rewriting it in TypeScript. It's the same story every single time. Even for projects with 100% test coverage.

  • pavlov22 hours ago
    My humble opinion is that most applications that need scripting should just embed JavaScript.

    It has a massive advantage in the existing ecosystem. People don’t always love it, but millions of them already know it. You don’t need to re-learn array indexing and nil termination quirks and the dozens of other things where Lua is subtly different in ways that you’ll find out at runtime.

    There are JS engines in all sizes, from tiny interpreters to world-leading JIT performance monsters. And most operating systems nowadays ship with a high-quality engine and embedding framework that you can access without installing anything, like JavaScriptCore on the Apple platforms.

    Do your users a favor and just use the boring standard language instead of making them learn a completely new one that doesn’t offer meaningful practical benefits for them. It’s probably not as much fun for you, but you’ll save time for both the users and yourself in the end.

    • Demiurge22 hours ago
      On one hand, you have JavaScript's ubiquity; on the other hand, you have the JavaScript "WTF" equality and type-conversion matrix, "WTF" scoping, and dozens of other footguns that Lua doesn't have. Some people would rather everyone learned Lua's subtle differences and wrote less buggy code. I've had to use it for the web for 20 years, and if I have any choice, I avoid it.

      Anyway, every language has some fans, and I don't mean to dismiss them, but ubiquity is not the top priority, all the time.

      • IshKebab19 hours ago
        Fortunately most of JavaScript's WTFs are avoidable (though I do wish they'd add a "really strict;" mode that actually removed them).
        • Demiurge16 hours ago
          Yeah, that would be great. I remember this was helpful in Perl, and ES6 days.
      • wruza12 hours ago
        JavaScript has very nice scoping compared to Lua.

        Js needs no forward declarations, which in Lua turn into the ugly “local foo; … … … foo = function()…end”. All function declarations are “hoisted” by default.

        Js also uses “export”s or “module.exports[…]” instead of returning a symbol table, which allows cross-referencing.

        Js has a function scope (var) together with local scoping.

        Js doesn’t allow accidental shadowing in the same scope. Lua: local x; local x; silently creates two variables, good luck debugging that closure.

        Js type coercion is only insane in the parts where the operation itself makes zero sense. Please don’t tell me that ![0] or [] + {} makes sense to you, or that you use them in other languages. Or that s+n and n+s seem safe everywhere except in js.

        Js has a ternary operator and ?? ?. ??= operator group. Lua only has that “and-or” crutch and “not X” which returns true only for nil and false.

        Js “…arr” is a first-class citizen, while Lua’s “…” decays to a single value (its first) if anything follows it.

        Js (normally) ordered properties system is much more convenient in dev and debug time than an unordered mess of meta-hashtables.

        Js doesn’t require you to

          {
            myPropertyHere = gotSomeObject.myPropertyHere,
            …
            IllRepeatThisAsManyTimesAsNeeded = gotSomeObject.causeWhyNot,
          }
          gotSomeObject.someNumericField = gotSomeObject.someNumericField + 1
        
        which is extremely abhorrent DX in 2024.
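        For contrast, the JS counterparts to the boilerplate above, using the spread, shorthand, and nullish operators listed earlier (a sketch; the object and field names are made up):

        ```javascript
        const gotSomeObject = { myPropertyHere: 1, causeWhyNot: 2, someNumericField: 41 };
        // spread copies every field without naming each one
        const picked = { ...gotSomeObject };
        // compound assignment instead of repeating the full path
        gotSomeObject.someNumericField += 1;
        // optional chaining + nullish coalescing instead of the and-or crutch
        const fallback = gotSomeObject.missing?.deep ?? "default";
        console.log(gotSomeObject.someNumericField); // 42
        console.log(fallback); // "default"
        ```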
      • therein21 hours ago
        There are the equality gotchas in JavaScript, but then you have 1-indexed arrays in Lua, which is a footgun that I know exists; I try to avoid shooting myself in the foot with it, but I keep doing it.

        It is beyond a footgun, more like a footgun concealed as a shoehorn.

        Either way I find embedding a Lua VM into a project so much easier than embedding a Javascript one.

        • kragen17 hours ago
          Which JS VMs have you tried embedding? People in this thread have suggested https://github.com/Moddable-OpenSource/moddable, https://bellard.org/quickjs/, and https://duktape.org/, but I haven't tried any of them.
        • Demiurge16 hours ago
          Indexing, I'll give you this one... I think it was intended as some kind of gift to non-programmers, a choice many other older languages made, but history decided to go another way, unfortunately for Lua. Still, I haven't had any issues adjusting to this one, personally. It's kind of like Python whitespace: I think I just forget about it.
      • leptons20 hours ago
        The only people saying "WTF" about Javascript are people who haven't taken the time to learn how to use it effectively. There are "WTF" things about every language, not just Javascript.

        Noobs are going to find the footguns in any language. Every language can be misused. I've been using Javascript over 20 years and have never thought it was any worse than any other language. Yes, it's dynamic, and yes, you can do stupid things with that dynamism, but for someone who actually knows how to use it, it's quite easy to work with.

        I recently started using Lua and I'm not impressed at all. It's just another language, it's really up to the programmer what can be done with it, just like Javascript.

        • 3a2d2919 hours ago
          This is the common argument used by people defending languages with tons of footguns.

          If this is true, why not write everything in C++?

          JavaScript includes so many footguns that people know are bad (see the billions of examples of different types being added together and giving random answers). Every language has some difficult parts; pretending that means any amount is fine is crazy. JavaScript has earned its reputation.

          • leptons14 hours ago
            >(see the billion of examples of different types being added and getting random answers)

            Almost every "footgun" people cite about Javascript is something that would never be written as working code, by any competent programmer - they are "WTF" in that the person writing the "gotcha" has to go way outside of normal programming behavior to get the "WTF" result. Most of these "WTF" examples should be taken with a grain of salt.

            If you're expecting to "WTF" about Javascript's type inference/coercion, know that it works according to logical rules.

            https://media.geeksforgeeks.org/wp-content/uploads/advanced-...

            This matrix should be pretty easy to understand. Yes some things might seem weird like [[]] == false, and [1] == true, but nobody should be writing code like that, it's just illogical to do anything this way and I don't consider it to be a "gotcha". "Just because you can, doesn't mean you should" can be said about footguns in every language.
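            Concretely, a few entries from that kind of matrix, checked in Node:

```javascript
// == applies coercion (ToPrimitive, then ToNumber); === never does.
// Arrays coerce via toString: [[]] -> "" -> 0, [1] -> "1" -> 1.
console.log([[]] == false); // true  ("" -> 0, false -> 0)
console.log([1] == true);   // true  ("1" -> 1, true -> 1)
console.log([1] === true);  // false (no coercion, types differ)
```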

            Yes, there are "WTF"s, but just like if I were to write a C++ "WTF", it wouldn't be something that most people would be tripped up by unless they were specifically looking for a way to say "WTF" about the language.

            These "javascript sucks" internet arguments have persisted for decades, and it's always complaints about the language that are citing unreasonable examples to prove a point that a programming language has some rough edges. Every programming language has some rough edges.

            Javascript maybe has more footguns than some other languages because it is so dynamic, and the dynamism has its benefits too that other languages don't have. I do like it and I use it effectively, YMMV. People just seem to be specifically mad at Javascript because it's so popular and was made so by being the defacto standard for web browsers. It's part jealousy and part frustration that their favorite programming language isn't the one that web browsers include by default.

            • mattlondon8 hours ago
              Agreed. I think a lot of the "classic" footguns are picked up by current-day linters/AI/code-assists anyway, so even if you do somehow end up writing [[]]==false legitimately (explicitly or implicitly), you'll probably get a squiggly warning about it on your IDE immediately warning you.

              For better or worse, JavaScript is the lingua franca of modern programming - you have a high-performance runtime environment pre-installed on basically anything, and readily installed on greybeard boxen if you are that way inclined. It is my "go to" for all hobby projects, frontend and backend. So versatile and a genuine joy to code in.

        • int_19h16 hours ago
          Python is also dynamic, but it doesn't let you do things like (1 + "a"), for example.
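          A quick sketch of the contrast (the JavaScript side; in Python, `1 + "a"` raises a TypeError):

```javascript
// JavaScript coerces silently where Python would raise:
console.log(1 + "a");  // "1a" (number coerced to string)
console.log("3" * 2);  // 6    (string coerced to number)
console.log("3" + 2);  // "32" (+ prefers string concatenation)
```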
        • Supermancho19 hours ago
          Lua feels and acts like a worse javascript. Yes the runtime can make the memory footprint really small, but manipulation and validation of the crappy list/table structures makes the implementations miserably large. I've only written about a million lines of it (literally), so I'm pretty set in my opinion.
        • Demiurge16 hours ago
          Okay, so the first thing everyone should learn is not to use "==" but "===". Or, perhaps, just use TypeScript, right? I think that's what people call "effective JavaScript" these days.
          • leptons13 hours ago
            Quite a lot can be built reliably with Javascript following only "use ===". YMMV. Typescript doesn't solve much unless you use it religiously (and few people really do), it still allows for all kinds of "gotchas" if you want to write "gotchas" with it.
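            To make the "use ===" advice concrete, a few comparisons that differ between the two operators:

```javascript
// == coerces before comparing; === compares type and value.
console.log("1" == 1);            // true  (string coerced to number)
console.log("1" === 1);           // false
console.log(null == undefined);   // true  (special case in the spec)
console.log(null === undefined);  // false
```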
    • kragen18 hours ago
      Lua 5.2 is 15000 lines of portable ANSI C. On amd64 the Debian default configuration of the interpreter compiles to a 950KiB executable that strips to 224K. I think reduced configurations can go below 100K. It doesn't start extra threads, demand a thread of its own, or have tricky memory management demands. What JS interpreter would you recommend for cases where that's a good fit? Something that's less than a megabyte and can be compiled to run on weird architectures.
      • Vogtinator18 hours ago
        https://duktape.org is in a similar ballpark.
        • kragen17 hours ago
          This does look great! Unfortunately it looks like it'll be hard for me to build it; it's yet another project that got fucked over by the Python community's irresponsibility.
          • tecleandor6 hours ago
            Why is that? I think I haven't heard of Duktape before, so I don't have the background for whatever happened.
            • kragen4 hours ago
              Its build system apparently depends on Python 2.
              • rcarmo4 hours ago
                No it doesn't. People who complain about that clearly haven't actually checked - there's a warning in the old configuration scripts as well:

                * Duktape python tooling is obsolete, migrate to JS-based tooling! *

                • kragen1 hour ago
                  I was looking at the example build configuration scripts on the project home page. Not somewhere linked from the home page, actually inline code on the page. Today! Literally as I'm writing this comment, it says:

                    $ python2 duktape-2.6.0/tools/configure.py --output-directory src-duktape \
                        -UDUK_USE_ES6_PROXY
                  
                  I'm glad to hear my impression that there was no other build system was wrong!
    • Groxx22 hours ago
      For cases where you can realistically include a full v8 in your project, yea - that seems fairly reasonable nowadays.

      Javascript took an extremely high-complexity, high-cost, and nearly-all-effort-in-just-two-impls-for-over-a-decade route to get to the current (stunning) performance though. How do small-scale embedded interpreters stack up? That's what many things that aren't Electron apps will be using.

      I ask this honestly, to be clear - I'm not sure how they stack up. I know Lua has historically targeted and dominated this area so I'd expect it to win, but I have not seen any data and I largely agree - JS is a massively better development environment now, and I would love to choose it. Everything I can find pits it against v8, and that's kinda like comparing the performance of a bike against a train - they are so different in many situations that it's meaningless. Otherwise I'd probably choose WASM today, or maybe docker containers where performance and platform needs aren't an issue.

    • tecleandor6 hours ago
      The problem I have with Javascript is that, for some reason (mostly, my lack of practice), it looks really difficult to me.

      I'm no developer, I do Linux and some bash and Python, and Lua looks "familiar" to me, so I can quickly pull up something simple.

      With Javascript everything looks so alien to me, and it's a pity because there are lots of things that run Javascript (functions in the cloud and the like).

    • DanielHB17 hours ago
      Lua used to be very popular for embedding because the runtime was minuscule so it could run in very constrained environments (like old consoles). This is mostly a non-issue these days though.

      JS and Lua are very similar feature-wise and I imagine Lua hasn't gotten the massive JIT runtime optimizations that JS has, so JS seems a far better choice these days.

      • Isogash16 hours ago
        Once you reach LuaJIT levels of performance though it's not like you need more. If you're doing a huge amount of heavy lifting in your embedded runtime then you've probably designed the application wrong.

          The far better arguments are that it's more common and there are more libraries.

        • DanielHB5 hours ago
          Yeah that is a good point, I also imagine lua is far easier (build/linking) to add into your application compared to pulling V8 or JavascriptCore.

          So if you only need a small language for a couple thousand lines of code, then it still might be a better choice. But for larger projects JS still seems better due to the ecosystem and knowledge available.

          But if you compare a game like Baldur's Gate 2, where most of the event logic and quest state is in the scripting language, then maybe it is worth spending the time to add JS. Whereas if it is like Civilization 6, where only small stuff like UI and I/O mapping is in the scripting language, then it is a different matter.

    • SoftTalker22 hours ago
      This probably makes more sense in 2024, but for many years JavaScript was not a serious option. It wasn't until the mid-to-late 2000s that it started to get traction as an application language and had any way to run it outside of a web browser.

      Lua as an embedded language already had over a decade head start.

      • leptons19 hours ago
        Netscape had javascript running server-side in 1995, only a couple years after Lua was created. So no, Lua did not have "over a decade head start", it had 1 year.

        Microsoft also had server-side JS, and still do. Active Server Pages (ASP) were scriptable with javascript since 1996. Microsoft's JScript.net has been around since 2002 (even if not well supported or maintained by M$, it still works). I've written web back-ends, Windows desktop applications, and even .dlls with JScript.net as far back as 2003.

        Javascript has been Adobe's scripting engine for Photoshop since 2005. Sony chose Javascript for Vegas Video's scripting language in 2003.

        Javascript has so many uses outside of a web browser, and has for a very long time.

        • SoftTalker18 hours ago
          Yes but for the first 10 years (1995 - ~2005) JavaScript was not taken seriously as an application language in most dev shops. Active Server Pages were overwhelmingly coded in VBScript. JavaScript was used in little snippets in "onBlur" attributes to do client-side validation of fields on a form, pop up "alert" boxes, and enable submit buttons. It wasn't until XMLHttp and prototype.js and jquery and stuff like that came along that mainstream developers started to understand what JavaScript was capable of.
          • leptons16 hours ago
            I guess I wasn't a "mainstream" developer in the 1990's? Before XMLHTTPRequest, I was using iframes to asynchronously load content into the browser, essentially accomplishing the same thing before "AJAX" was a thing. We were doing long-polling before there was websockets. I was using Javascript and CSS to create "DHTML" SPAs before XMLHTTPRequest, with Javascript for front-end and back-end. You either know the potential of the tools that are available, or you don't.

            But I guess my anecdotal experience doesn't matter, because I wasn't a "mainstream" developer? From my point of view, everyone else was missing out on using one language for front-end and back-end.

            Context switching has a real cost in terms of developer burnout, and it's something I've happily avoided for 25 years.

    • krapp22 hours ago
      "most people should use the things I use and am familiar with" is a common opinion but you should realize that between Javascript and Lua, Lua is the boring standard language in this context.

      Also there are whole books written on the quirks you need to master with Javascript. Lua has its rough edges but it's a much simpler language, so there are far fewer of them.

      • mardifoufs20 hours ago
        Lua has a ton of sharp edges, they just don't manifest as much as with JavaScript, since JS has a different "typical use case", and they aren't discussed as often. Writing user-facing UIs, with all that it entails, is a great way to uncover and experience all possible sharp edges.

        And a lot of the issues with JavaScript just wouldn't matter for an "embedded scripting engine" use case imo.

      • pavlov22 hours ago
        What’s the end-user segment where Lua is more widely known than JavaScript?

        Game dev, possibly (even that seems highly debatable). But the article was making the case that Lua should be more widely used outside of that niche.

        • mikey_p21 hours ago
          It doesn't matter if it's widely known; it honestly is much nicer to work with than Javascript, and it is being used as intended, instead of being a half-baked language, designed in a hurry to work in a browser, that got repurposed as a general-purpose scripting language later.
    • 2OEH8eoCRo019 hours ago
      My exact thought when trying to program my Logitech mouse (which uses Lua).

      I also think 9 out of 10 languages are overrated and we have far too many.

  • xedrac38 minutes ago
    My complaints with Lua are mostly superficial, but they irk me nonetheless. So if I have to use it, I'll usually opt for Fennel.
  • aeturnum22 hours ago
    I have really enjoyed working with Lua when it's come up and I think it's an extremely good language. In particular, as Noë says, its interface for embedding into C/C++ is clean and flexible. There are performance concerns, but considering how widely Lua is used for game logic in high performance video games clearly those concerns can be addressed.

    That said, I think there are other warts that can be confusing and make the language difficult. The Nil-Terminated Arrays section points out one oddity about `nil` - but there are many others. There are a number of confusing corner cases (or were - I haven't used it in a while). As an example, if you have a list of values and you pass that list into a new table, the resulting table may be different than if you add each value in that list in order.

    I also think that the flexibility of embedding means that you are often learning the Lua <=> C interface as much as you are learning Lua. So jumping from deploy to deploy is more confusing than most languages.

    Nevertheless I fully agree the language is underrated!

    • 01HNNWZ0MV43FF22 hours ago
      > you are often learning the Lua <=> C interface as much as you are learning Lua.

      One time I had a memory leak in a game, something like exactly 4 bytes per frame.

      It turned out I was not popping the Lua-C parameter passing stack properly after each frame.

      I found this out after some unhelpful person suggested that "all programs just leak memory" lol

      • spacechild121 hours ago
        The Lua C API is fantastic from a design perspective, but as you have discovered, it is rather error prone. Fortunately, there are many wrapper libraries that take care of all the stack manipulation. For example, I'm using https://github.com/ThePhD/sol2 to interface with C++ and the API is incredibly ergonomic.
        • wruza17 hours ago
          How is it fantastic, it’s just barebones stack interface thrown in your general direction.

          Any integration code that doesn’t use an ad-hoc helper library looks like assembly listing (not even in disguise, cause it basically is bytecode but in C).

          • spacechild18 hours ago
            > How is it fantastic

            See my other reply: https://news.ycombinator.com/item?id=42519033

            > Any integration code that doesn’t use an ad-hoc helper library

            There are mature bindings for most languages. You only need to deal with the C API directly when you

            1. Create a new language binding

            2. Write a Lua extension

            3. Call Lua from C

            Typically, Lua is embedded as a scripting or configuration language into another project, and in this case you wouldn't even be aware of the C API.

        • ahoka19 hours ago
          An API cannot be “fantastic” if it’s error prone.
          • spacechild117 hours ago
            It is a low-level API for creating bindings to other languages and as such typically isn't used directly. (Well, unless you are writing C :)

            What I find fantastic about the stack-based API is that it exposes very few interpreter internals. As a result the authors may drastically change the implementation without breaking the public API. Most importantly, they hide the internal object representation and as a user you only ever deal with plain C types, like integers, floats, C-strings and user data pointers. As a consequence, you don't have to think about memory management because all types are passed by value! In fact, someone could theoretically write a Lua implementation with reference counting and the API wouldn't have to change. It really is a very smart and elegant design.

          • mananaysiempre6 hours ago
            You can only make a C API for interacting with a precise garbage collector that ergonomic. And, to be honest, I wouldn’t call the situation described by the GGP particularly problematic—you messed up memory accounting and you got a memory leak. In most garbage-collected runtimes, if the runtime messes up memory accounting, at best you get immediate memory corruption, at worst you get memory corruption much later in some unrelated part of your heap. So this is pretty mild all things considered. Of course, if you’re working in a language with greater metaprogramming capabilities than C, absolutely do abstract the stack into something more manageable.
          • kragen18 hours ago
            It might be fantastic in other ways; there are multiple ways an API can be good. For example, it could be very expressive and flexible, or it could be especially amenable to formal proofs with things like TLA+ even if it's error-prone in human hands, or it could facilitate very high performance. Even on the error-proneness axis, an error-prone API might still be less error-prone than the alternatives.
    • forrestthewoods21 hours ago
      > but considering how widely Lua is used for game logic in high performance video games clearly those concerns can be addressed.

      It’s ok, but should be much better.

      Once upon a time Supreme Commander used Lua. Perf on consoles for SupCom2 was catastrophically bad. We switched to Kore which was effectively Lua but way faster (10x?).

      Kore was bought by Havok and rebranded into Havok Script. Havok got bought by Microsoft. I’m honestly not even sure if you can still buy it today. It’s not listed on their website.

      These days I think you’d use Luau? I’m not sure. https://github.com/luau-lang/luau

      • a_t4819 hours ago
        LuaJIT is pretty good, but some platforms might not allow it. iOS technically didn't allow it when I used it before, but Apple either didn't _actually_ care or just looked the other way.
  • 01HNNWZ0MV43FF22 hours ago
    I've had fun embedding Lua a couple times, but as with wasm I almost never have a case where I need to run new code without being able to recompile the whole binary, so I don't get to use it often.

    The most fun was when I made a game engine ("engine") in C++ that used Lua for all the game logic. The fast debug cycles are very useful there. Maybe I should still do that for Rust gamedev, since Rust notoriously compiles slowly.

    The second-most fun was an IRC bot that could reload its Lua logic without disconnecting. Nowadays I'd write a bouncer and just let the main process restart. (Or use wasm, a wasm IRC bot would be good...)

    The third-most fun was using Lua as a Turing-complete configuration language. If I needed an array of something, I just made Lua make an array, instead of generating Lua code.
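    That style of config might look something like this (file and field names are hypothetical):

```lua
-- settings.lua: configuration that can compute its own values
title  = "My Game"
window = { width = 1280, height = 720 }

-- generate a list instead of writing it out by hand
levels = {}
for i = 1, 10 do
  levels[i] = string.format("maps/level_%02d.map", i)
end
```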

    One of the least fun was the Blender plugin that exported 3D models as generated Lua. If I did that one again today, I'd just use MsgPack or some ad-hoc binary format.

  • ternnoburn19 hours ago
    Lua is deeply, deeply overrated, ime. But I'm coming from the games industry where it's widely used.

    The only benefit of Lua ime is the ease at which it embeds in C++. Beyond that, it's a nightmare.

    • julianeon18 hours ago
      I'm getting the sense overall that Lua is correctly rated.
  • hipadev2322 hours ago
    Luau [1] developed by Roblox is a backwards-compatible fork of Lua 5.1 with some very significant improvements to speed and QoL [2]. It also mostly alleviates the nil-terminated array problem.

    [1] https://github.com/luau-lang/luau

    [2] https://github.com/luau-lang/rfcs/tree/master/docs

    • orphea2 hours ago
      Happy to see Luau is mentioned!

      Its greatest improvement over vanilla Lua is better sandboxing. Lua is just a pain in the ass to embed safely. So many footguns.

  • lopatin22 hours ago
    I think it's a stretch to call it underrated. Isn't it the defacto language for embedded scripting stuff? I've seen it used as a mini language in production in game engines, databases, and trading systems.

    On the other hand, if the article is saying that it deserves to be more highly rated in the non-embedded space, I think I would need more than a handful of syntactical niceties to explain why it deserves to be in the same conversation as Python, Ruby, etc...

  • liberix1 hour ago
    Maybe it's time to check out Luon [1], which has a syntax similar to Oberon and borrows some concepts from Lua; it's statically typed and targets the LuaJIT VM. It was recently covered on HN [2]

    [1] https://github.com/rochus-keller/Luon/blob/master/Readme.md [2] https://news.ycombinator.com/item?id=42413343

  • YoshiRulz15 hours ago
    There are a lot of good comments here already, but I can't let this pass by the main page without taking the opportunity to rail against Lua because, no, it's not underrated. It's everywhere and terrible.

    This article is fairly light on detail, only mentioning a couple table footguns, but there are MANY more. (I also disagree with it being "accessible even for beginners", but I'll stick to objective things.) For starters, the size operator `#` doesn't work on map-like tables, and there's no easy way to copy part or all of a table, or to serialise one for printing.

    Lua doesn't have `switch` or even `break`/`continue`. Though it added a `goto`—years after we collectively realised that's an antifeature. You can use `and`+`or` as a ternary, but you need to remember that it works differently with bools and that `nil` is falsey ofc. And `0` is truthy. Using a variable which hasn't been declared yet or is otherwise out of scope gives `nil` rather than an error. In fact most logic errors are SILENT (yay dynamic typing), and some syntax errors are raised far from the actual cause. `<const>` isn't.
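    A few of those behaviors, runnable as-is in Lua 5.4:

```lua
print(#"abc")           --> 3 (# works on strings and sequences, not map-like tables)
print(#{a = 1, b = 2})  --> 0 (map-like table: # sees an empty sequence)
print(undeclared_var)   --> nil, silently, instead of an error
if 0 then print("0 is truthy") end  -- prints: unlike C, 0 and "" are truthy
local x = true and false or "fallback"
print(x)                --> fallback (the and/or "ternary" breaks on false/nil)
```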

    Before Lua 5.3, all numbers were floats. The patterns used for `string.match` look superficially like RegEx but they are not, despite them predating Lua. The stdlib is woefully lacking, with the official documentation seeming to taunt you with examples of how several common functions could be implemented, but it's left to you to copy them into your projects.

    So yeah, Lua might be small and quaint, but that's only because so much is not included "in the box", and what is included is no good.

  • Animats21 hours ago
    Except for "tables". Lua tables are too weird. Trying to make one construct do the work of an array, a struct, and a dict leads to a messy construct.
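    For the unfamiliar, a sketch of one table wearing all three hats at once:

```lua
local t = {
  "x", "y",            -- array part: t[1], t[2]
  name = "point",      -- struct-style field
  ["odd key"] = true,  -- arbitrary dict-style key
}
print(t[1], t.name, t["odd key"])  --> x  point  true
print(#t)  --> 2 (# only counts the array part)
```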
    • int_19h16 hours ago
      There was a time when this approach was common for dynamic languages in general. PHP and, to some extent, JS also come to mind.

      I think it's clear in retrospect that it makes more sense to keep lists and maps separate even in a high-level dynamic language that doesn't care much about perf, and any purported gains in simplicity from having a single data structure are illusory because of footguns. But perhaps it wasn't so clear before we had enough experience with this kind of design.

      Then again, Python had a clean separation between lists and dicts from the get go, and it predates Lua as well as PHP and JS...

      • Animats15 hours ago
        > even in a high-level dynamic language that doesn't care much about perf

        Lua is mostly used inside games, and the JIT compilers for it are said to be pretty good. (I haven't tried this yet, but will have to soon.)

    • giraffe_lady19 hours ago
      There is also a very common two-liner invocation that turns them into prototypical classes as well. Extremely versatile data structure (pejorative).
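      The two-liner in question is essentially `local C = {}` plus `C.__index = C`; a minimal sketch, with hypothetical names:

```lua
local Animal = {}
Animal.__index = Animal  -- failed lookups on instances fall back to Animal

function Animal.new(name)
  return setmetatable({ name = name }, Animal)
end

function Animal:speak()
  return self.name .. " makes a sound"
end

local cat = Animal.new("cat")
print(cat:speak())  --> cat makes a sound
```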
      • ramses019 hours ago
        Perl would like to have a word... ;-)

        But seriously, they do have some interesting quirks which make sense if you think about it (and maybe Lua as a semi-contemporary got its ideas from there): specifically, hashes flatten to alternating keys and values (even/odd positions) in a flat array... and of course objects are just key/values, but with some values being names and functions...

        https://perlmaven.com/creating-hash-from-an-array

  • jdblair12 hours ago
    Lua fits the same niche as Tcl, but the runtime is smaller and way better. Tcl started as a simple, embeddable scripting language (and low performance, it was just string substitutions!) and evolved into Tcl8, a mostly backward compatible object system after successive iterations to improve its performance and expressiveness. "everything is a table" gets you pretty far in Lua, in much the same way that "everything is a list" in lisp, and was a much better abstraction than Tcl's "everything is a string."

    When I last used Lua professionally (10 years ago) I did discover some foot-guns related to the handling of NULL / nil in the LuaSQL module. The Lua array length function was based on nil-termination, so a NULL value in a column could cause issues. Maybe this has been fixed by now?

  • pjmlp5 hours ago
    > Lua is used in nvim for plugins since 0.5.0, you bet it's efficient !

    Not necessarily; plenty of editors have scripting options that aren't going to win any performance prize.

    Also, vi's architecture goes back to the days of single-digit-MHz CPUs.

  • ashdnazg8 hours ago
    In the Spring/Recoil RTS game engine[1] we adapted Lua to have deterministic execution allowing synchronous multiplayer simulation. In addition we relatively easily implemented a precise[2] serialization of the Lua VM, allowing us to support save/load for games with no code required from their side.

    In other languages these would require much more effort.

    [1] https://github.com/spring/spring or https://github.com/beyond-all-reason/spring (yay for forks!)

    [2] Preserving the order of iteration on items in a hash table after deserialization and further manipulations.

  • pshc18 hours ago
    Lua as a language is pretty warty, with some obvious omissions. But it’s widespread as an embedded language, and has been around a long time, and I think that tends to give it a nostalgic halo effect.

    I would like to see more minimalist scripting languages with modern sensibilities replace it.

  • thunkle11 hours ago
    I've been writing lots of lua recently. The only two things that bother me are indexes starting at 1. It's made my 2d grid math more complicated. I also don't like how I have to write a goto statement if I want to continue/next inside a loop. Other than that I love the simplicity.
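    For reference, the goto-based "continue" idiom (Lua 5.2+) looks like:

```lua
for i = 1, 5 do
  if i % 2 == 0 then goto continue end
  print(i)  -- prints 1, 3, 5
  ::continue::
end
```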
  • K0IN18 hours ago
    I used Lua for the first time in Garry's Mod, and I had an absolute blast. It was my first real programming language, so I can vouch that it's easy to pick up :) I also must admit the gmod Lua extension (GLua) was so much more enjoyable that since then I keep my own Lua version with ! patched in for "not" and != as an alternative for ~= (Lua's way of writing not-equal)
    • nxobject8 hours ago
      Ah yes, GLON with a touch of datastream ;)
  • jhoechtl9 hours ago
    Lua as a language is underrated, yes. Lua as the implementation of PUC-Rio is what it is. It misses abstractions in the runtime to make it pre-emptive and parallel. It is also somewhat telling that Mike Pall, despite his comeback announcement, didn't deliver.

    It comes with batteries, but they only fit one brand well. Compare this to e.g. golang or Python.

  • tga20 hours ago
    Mandatory plug for Redbean[0], a standalone Lua webserver (a single-file that runs natively on six OSes!), and Fullmoon[1], a web framework built on top of it.

    Not exactly mainstream, but so simple and elegant.

    [0] https://redbean.dev/

    [1] https://github.com/pkulchenko/fullmoon

  • owenthejumper5 hours ago
    The article doesn't mention some of the most famous Lua cases out there: Roblox used it for game developers, HAProxy uses it for extending the LB, and others.
  • enduranceX5 hours ago
    I took Roberto's awesome "Building a Programming Language" course in 2022 and during a live class I had the opportunity to ask why they don't incorporate more stuff into the language. He said that it was designed to be lightweight, effective and simple. Anything that goes against those principles should not be integrated into Lua's codebase.
  • soapdog21 hours ago
    Really liked that post, then again Lua is one of my favourite languages.

    I wrote this post https://andregarzia.com/2021/01/lua-a-misunderstood-language... some time ago and it kinda touches similar points as the OP. Read on if any of yous want to see yet another person talking about Lua.

  • hprotagonist22 hours ago
    especially with fennel on top, i can certainly see the uses and appeal!

    coming from mostly python, i miss a robust stdlib.

    • digdugdirk21 hours ago
      I've been more and more intrigued by fennel lately, and I've been meaning to dive into the functional end of the pool. Any recommendations for how to get started with it, especially if coming from a python background?
      • livrem17 hours ago
        I have learned it (somewhat) by playing around in TIC-80 and Löve2D (including LoveDOS).

        Both Lua and Fennel are tiny languages. You can skim all the documentation in a short evening. Fennel does not even have its own standard library, so keep the Lua documentation close at hand.

        I appreciate the simplicity of both Lua and Fennel. There are some ugly parts, but all languages have those.

        Janet (later project by Fennel's creator) is a nicer language that is also portable and embeddable, and has a nice standard library, but Fennel runs anywhere Lua runs. Janet is great just on its own for scripting and not sure I would want to use Fennel for that anywhere Janet is available. But as a way to script games in e.g. Löve2D it seems like an excellent choice.

      • n8henrie21 hours ago
        Also interested in resources here -- familiar with python, go, rust, but never used a lisp.

        As a learning experiment, I was able to use the fennel website to convert my Hammerspoon config from lua to fennel, but I still struggle to add new functionality. Certainly have not delved into macros, but this is where much of the power lies AIUI.

  • keb_22 hours ago
    I love Lua as a language; even if it can be a bit cumbersome at times, its simplicity and low barrier of entry is refreshing. I'm currently building a full-stack web application with https://redbean.dev in Lua and it's mostly been a joy.
  • habibur17 hours ago
    Add to that: no way to determine the size of an array, which is the primary data structure anyway.

    No item.length property.

    You might decide to count with foreach(), but that will terminate when it encounters the first value in the array that is nil.

    • talideon14 hours ago
      Given the way that tables work, the concept of a 'length' is a fuzzy one. However, the '#' operator will give you the last index of a table, which for tables used as arrays is pretty much what you want. There's also the '__len' metamethod, which is what's invoked if it's present if you use the '#' operator. The operator's been in the language since at least 5.1.

      It's a bad idea to think of Lua tables as arrays. They're maps/dictionaries with some extra semantics to allow them to be used similarly to arrays.
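      A rough sketch of both points:

```lua
local arr = { "a", "b", "c" }
print(#arr)  --> 3

-- with an embedded nil, # may return either "border":
local holey = { "a", nil, "c" }
print(#holey)  -- may legally be 1 or 3; implementation-defined

-- __len (honored for tables since Lua 5.2) lets a table answer # itself:
local t = setmetatable({}, { __len = function() return 42 end })
print(#t)  --> 42
```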

    • MomsAVoxell11 hours ago
      > You might decide to count with foreach(), but that will terminate when it encounters the first value in the array that is NULL.

      True only if the programmer doesn’t take the effort to understand the difference between ipairs and pairs.
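      The difference in a nutshell:

```lua
local t = { "a", nil, "c" }

-- ipairs walks 1, 2, 3, ... and stops at the first nil:
for i, v in ipairs(t) do print(i, v) end  -- prints only: 1  a

-- pairs visits every non-nil key, in no guaranteed order:
for k, v in pairs(t) do print(k, v) end   -- prints 1 a and 3 c
```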

  • cromantin14 hours ago
    The best way to use Lua is via TypeScript - https://typescripttolua.github.io/

    We have a huge Lua codebase for game client and game server shared code, all written in TypeScript. Our developers only see Lua if something does not work, and that's not often.

  • gustavopezzi13 hours ago
    Lua is a great language for extending systems and very high-level scripting. I like it. I had the pleasure of teaching a module in 2020 together with Roberto Ierusalimschy where we used Lua and LPeg to teach students how to write a simple programming language. It was a very fun class.
  • miohtama22 hours ago
    I guess one of the issues is that because Lua is tailored for embedded use case, it lacks appeal as a general purpose software development ecosystem and thus its usage is marginal.
    • hinkley22 hours ago
      It’s been adopted in game engines to handle story logic, so the people writing the game can be decoupled from the people writing the engine, and don’t have to code in a low level language against a constantly moving target.

      And I would argue that a lot of the staying power of World of Warcraft came from the Lua based addons democratizing the UI. Blizzard co-opted a lot of features from best of breed addons back into the base game. (That and negative pressure in the economy to keep from having every new character wielding the third best sword in the game due to deflation, but that’s a story for a different thread)

  • jezek212 hours ago
    If you like how easy Lua is to embed but don't like its syntax or features, you can try FixScript.

    It's a language I've created that has C-like syntax; the implementation is a single .c file. Its unique feature is the ability to arbitrarily extend the syntax (in fact, classes and the type system are implemented as a library). This can be used for the specific needs of your project.

    It also has the ability to set time limits for execution and can reload scripts in place for live code updates. It has a JIT by default, too.

    More information here: https://www.fixscript.org/blog/introduction

  • aulisius17 hours ago
    I recently came across TypescriptToLua[0], which has improved my experience of writing Lua manyfold. No more having to deal with 1-based indexes or a lack of type hints.

    [0] - https://typescripttolua.github.io/

  • jmclnx20 hours ago
    Lua can be used to access the NetBSD kernel, which is interesting. I have said it before, I should spend time learning Lua.
    • kragen15 hours ago
      You won't have to spend very much time. It's a very small language.
  • fire_lake9 hours ago
    Indexes should start at one in all languages IMO.

    The zero thing is a historical quirk due to pointer arithmetic.

  • cyberpunk6 hours ago
    Doesn’t lua use 1 indexed arrays? Breaks my brain.
    • tecleandor6 hours ago
      By default, yes. But you can define them to start at 0 (doing it index by index; I'm not sure if you can make that the default). Or, if you feel like messing with someone, you can start them at any other number. Like -5, for example. :D

      https://www.lua.org/pil/11.1.html
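
      For example (a small sketch in the spirit of the PiL chapter linked above):

      ```lua
      -- Nothing forces index 1: keys can start at -5 if you like.
      local q = {}
      for i = -5, 0 do q[i] = i * i end
      assert(q[-5] == 25)

      -- But '#' and ipairs only understand sequences starting at 1:
      assert(#q == 0)
      ```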

  • dr_kiszonka22 hours ago
    It looks like Lua's conditionals, for loops, and function definitions terminate with the "end" keyword. Is this considered more readable or does it bring some other benefits besides helping the interpreter parse code?

    Many people love Lua, so I suspect there is a good reason for this.

    • bawolff22 hours ago
      All programming languages have tiny differences. Some use }, some use keywords like "end" or "fi", some use indentation like python.

      It's a trivial difference that does not matter.

    • kragen17 hours ago
      It depends on which alternative you're thinking of.

      - }, like C: Lua uses this to terminate tables. You could use the same token for both purposes (Perl5 does) but the reduced redundancy comes at some cost to both readability and syntax error reporting, which are especially important in languages aimed at driveby programmers, like Lua.

      - ): Lisps generally just use ) to terminate every construct, reducing redundancy further.

      - different end tokens for each construct, like Ada and the Bourne shell (endif/end if/fi): improves readability and error reporting further, but requires more reserved words and is wordier.

      - end, like Lua: intermediate among the above choices.

      - indentation, like Python: seems to work well enough in Python, and especially helps beginners who are therefore not permitted to indent their code incorrectly, and adds a maximum amount of redundancy for opening and closing constructs (every line is explicitly tagged with its syntactic depth) without reducing readability. However, this aspect of Python was novel when Python was introduced, remains controversial, and has not been widely adopted in new languages.

      A funny thing about Lua's syntax is that it's not newline-sensitive, but semicolons are optional. This is valid Lua:

          function x(a, b)f(a)f(b)g = a b = a end
      
      Wherever two things are juxtaposed that can't be part of a single statement, it tries to parse them as two statements. This syntactic looseness means that some syntax errors that would be easily detected in semicolon languages aren't easily detected in Lua. This increases the penalty for removing additional syntactic redundancy.
    • 01HNNWZ0MV43FF21 hours ago
      Wikipedia says that came from Modula, but I can't find why Modula did it or why Lua decided to copy it.

      I thought I read somewhere that Lua was meant for use by non-programmers, so the "end" would be easier to type and read than curly brackets.

      • bawolff20 hours ago
        Modula is based on pascal which is based on algol. The "end" thing is in the first version of algol from 1958. This is 14 years before C was invented.

        Presumably they thought the more verbose syntax would be more readable.

        • int_19h16 hours ago
          Algol (and Pascal) originally did the same thing that C did, except that you used `begin` and `end` instead of `{` and `}` when you needed a compound statement as a branch etc.

          Modula changed it so that all structured constructs implicitly introduced a compound statement, and terminated one where appropriate (e.g. on `else`), but this still required some end marker for the last one.

          If the question is why `begin` and `end` and not `{` and `}` - for one thing, Algol was designed to be the language you'd write reference implementations of algorithms in for publication, so readability was more important than terseness. For another, it predates ASCII, and not all charsets in use at the time even had `{` and `}` as symbols.

  • orthoxerox8 hours ago
    I like Lua. It's a saner version of JS with far fewer warts. However, JS has a massive, enormous, insurmountable advantage in popularity. If I need an embeddable scripting engine in 2025, I will most likely go with JS.
  • arccy22 hours ago
    Lua is nice, but being embedded in many different places means there's often environment-specific context injected for you to interact with, and editor tooling won't know what's injected.
    • pseudony22 hours ago
      Not necessarily true. LuaLS, the most popular LSP, uses LuaCATS (Lua comments and type system) annotations. You can place annotation files (type stubs) in a directory and the LSP will pick them up.

      I use this for a project of mine where I embed a Lua interpreter where I made several additional Lua modules available.

    • leptons19 hours ago
      I ran into this recently with OBS scripting. I don't really like Lua, so I wanted to try making a Typescript-to-Lua toolchain specifically for OBS scripting. Well OBS injects a ton of stuff into the Lua environment, and I'd have to find a way to mock all of that in Typescript. Sadly, I don't have time for it, and I've chosen a different product than OBS because I don't really want to invest so much time learning a new language to accomplish something with free software when I can purchase something that just works for $40. My time is way more valuable, and this is the only time I've ever needed to code something in Lua. Yeah, OBS supports Python too, but I dislike Python even more than Lua.
      • kragen17 hours ago
        I'm surprised to hear that there's proprietary software that's competitive with OBS Studio. What is it?
        • leptons16 hours ago
          I just needed a program to capture a portion of the screen under my mouse and stream it as a webcam device to Zoom, so I can screen cast a portion of my gigantic workstation desktop (6480x3840) to people with 1920x1080 screens, while following my mouse cursor around.

          OBS was one stop along my journey to get there, as there's some Lua and Python scripts that achieve something similar, but not quite exactly what I wanted with multi-screens, so I began trying to modify the script to do what I needed it to do.

          In the end I'm going with something called "SparkoCam" which achieves this easily with no hassle, but it costs $40. Since $40 is less than what an hour of my time is worth, it was kind of a no-brainer to abandon the hassles of learning Lua (a many hour detour) and fiddling with OBS Studio to get what I wanted. I have no real use for Lua outside of this, so it would be time wasted on this one project.

          • kragen16 hours ago
            Interesting!
  • eviks8 hours ago
    > The one that bothers me the most, might be the fact that arrays (tables used as arrays) are nil-terminated, meaning the end of the array is marked by a nil value

    This alone should make the language even more underrated, so that no poor user is exposed to this kind of nonsense.
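
    To make the complaint concrete: a nil in the middle of an "array" leaves its length genuinely ambiguous, which the reference manual permits.

    ```lua
    -- {1, 2, nil, 4} has two valid "borders": 2 and 4.
    -- '#' may legitimately return either, depending on internal layout.
    local t = {1, 2, nil, 4}
    assert(#t == 2 or #t == 4)
    ```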

  • garbagelang14 hours ago
    This LLM written post is bad and you should feel bad.
  • synergy2022 hours ago
    i wish luajit has been actively evolving though
    • hinkley22 hours ago
      It’s hard to tell since there’s no changelog, but it looks like there are at least some improvements ongoing other than just simple bug fixes. Some of the changes in August for instance look like more than just for correctness.
    • mardifoufs20 hours ago
      It is evolving, they just don't want to take the same direction Lua has been taking since the past few releases.
  • protomolecule22 hours ago
    It's my language of choice for letting users customize an app, but the last client wanted Python because many people are already familiar with it and 'Scriptable with Python' looks better from the marketing point of view(
  • EGreg22 hours ago
    Is Lua currently even faster than JS and PHP, after all the effort that went into those languages’ optimizations?
    • shakna22 hours ago
      It depends. In Debian's [0][1] benchmarks, JS is clearly ahead in a handful of the benchmarks, but in the rest, Lua is either on-par or faster. They're within a shout of each other.

      There are some places where Lua has surprisingly poor performance. However, in almost everything else, it's one of the fastest around. Luajit is still one of the fastest JITs in existence.

      In the real world, rather than the constraints of a benchmark, Lua tends to be on par from what I've seen. When you're not cold-caching everything, its memory use is lower. But its standard library is ISO C, and that's about it. No batteries. And the others beat it soundly there, whilst being close to the same performance. Which will mean they will just about always win out.

      [0] https://benchmarksgame-team.pages.debian.net/benchmarksgame/...

      [1] https://benchmarksgame-team.pages.debian.net/benchmarksgame/...

      • igouy19 hours ago
        > a handful of the benchmarks

        Now made a little easier for your comparison

        https://benchmarksgame-team.pages.debian.net/benchmarksgame/...

        • Rochus17 hours ago
          Thanks. I calculated the geomean of the speed-up factors based on the average results (cpu secs) per benchmark. Overall, Lua 5.4.6 is factor 5.1 slower than V8 v23. Only in k-nucleotide Lua is slightly faster than V8, but in n-body Lua is even factor 26 slower.

          Btw. it seems notable that Lua 5.4.6 is about 40% slower than Lua 5.4.1, see http://software.rochus-keller.ch/are-we-fast-yet_Lua_results....

        • shakna11 hours ago
          Unfortunately, the comparison pages don't actually show every test, or even the same run, as the main pages. You'll find a lot of differences between what I linked, and what is linked here.
    • Rochus19 hours ago
      I did performance measurements in 2020 based on the Are-we-fast-yet benchmark suite which combines micro and larger benchmarks.

      Here are the results: http://software.rochus-keller.ch/are-we-fast-yet_lua_results...

      Overall, the JS/V8 version was twice as fast as the Lua/LuaJIT version. LuaJIT was only faster in the Mandelbrot benchmark. The recent PUC Lua version 5.4.1 is even ten times slower than the V8 version at the time. More recent V8 will likely be even faster. The results also demonstrate that one can get a pretty false impression when just looking at single microbenchmarks.

  • kragen17 hours ago
    I feel like, except for small size, this post doesn't really get into what I like about Lua, which is pretty unfortunate. Lua deserves a better case than this.

    I'm maybe not the best person to write this because I'm not really a huge Lua fan. Though on projects with others I'll use whatever language they're using, left to my own devices, I probably write about 3× as much JS as Lua, about 32× as much Python as Lua, and about 20× as much C as Lua.

    But Lua has some really nice attributes.

    It's a pretty reasonable very-high-level language comparable to Python, JS, Groovy, Perl, or Clojure. You can get a lot done in very little code. You don't have to crank out piles of boilerplate the way you do in Java or sometimes C. It comes with concise and readable syntax, flexible data structures, garbage collection, strong dynamic typing, list structure, hashmaps, dynamic method dispatch, inheritance, operator overloading, cooperative multithreading, eval, exception handling, namespaces, closures with lexical scoping, etc. Its documentation is first-class. Most of this is pretty much standard for the dynamic-language category nowadays; there's not much special in here to pick one language from the category over another.

    Lua's biggest advantage is that LuaJIT is motherfucking alien technology from the future. I wrote a straightforward escape-time fractal renderer in https://gitlab.com/kragen/bubbleos/-/blob/master/yeso/mand.l... with a dynamic function invocation in the inner loop (to switch between fractals) and LuaJIT's trace compilation just compiles that right out. There's also a real-time animated raymarcher in that directory in LuaJIT, though the framerate and resolution are pretty bad. You can literally just write high-level code at like a Python or JavaScript level and, as often as not, it just goes faster than C. (Sometimes it doesn't. JIT compilers are unpredictable and sometimes disappointing.)

    Now, probably somebody is going to reply to this and tell me that everybody runs JS in JIT compilers too, so this isn't a real difference. Unfortunately that's horseshit. I'm sorry, but V8 and SpiderMonkey are just not in the same league, though not because their implementors are less brilliant. They're just attempting a much harder job, because JS is a much hairier language. Consequently their performance is an order of magnitude worse.

    (Even regular non-JIT Lua is a lot faster than most other very-high-level languages, almost to the level of JS JIT compilers.)

    LuaJIT's C FFI is also just ... there's no expression in English sufficiently emphatic to express how great it is. The expression I'd normally use translates literally as "it's a very child of ten thousand whores," which hopefully conveys some of the emphasis if not the content. Here's the FFI binding for Yeso, the graphics library used for the above fractal demo: https://gitlab.com/kragen/bubbleos/-/blob/master/yeso/yeso.l.... Yeah, just by pasting the relevant lines from a C header file into your Lua program, you can call C code from a shared library as if it were Lua code. It's almost as easy as calling C code from C++!
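
    The shape of it, as a hedged sketch (LuaJIT only - stock PUC Lua has no ffi module):

    ```lua
    -- Paste a C declaration, then call straight into libc.
    local ffi = require("ffi")

    ffi.cdef[[
      int printf(const char *fmt, ...);
    ]]

    -- ffi.C resolves symbols from the process's default namespace
    ffi.C.printf("hello from C, via %s\n", "LuaJIT")
    ```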

    After getting screwed over (along with everybody else) by the Python community's gratuitous and ongoing breakage of backward compatibility, coupled with public shaming to persuade more people to break backward compatibility, the Lua community's approach to backward compatibility is looking pretty appealing. The vibe is that old versions of the language live forever, and you should keep using them instead of upgrading, but new versions usually aren't compatible. However, it's much easier to make a Lua library compatible with every existing version of Lua 4 and Lua 5 than to make a Python library compatible with any version of both Python 2 and Python 3.

    Being small, fast, and implemented purely in portable standard C makes it practical to run Lua even on larger microcontrollers like the ESP8266, even when they use weird CPU architectures. Even interactively. https://github.com/nodemcu/nodemcu-firmware

    LuaJIT is much less portable than PUC Lua, and also more than twice the size, about 580kB on my machine. This is still orders of magnitude less than something like Python or the JRE.

    Lua is a little more annoying than Python or node.js to develop interactively with, but it's overall pretty comparable. For code I'm writing myself (as opposed to getting from a library) the Lua version usually ends up being just a little bit bigger than the Python or JS version. Lua is more bug-prone, though, due to a number of language design flaws.

    Lua is a very newbie-friendly language, I think even more so than JS or current Python. It's very successful in systems aimed at beginning programmers, like Minetest (Luanti), World of Warcraft, and LÖVE2D. The barrier to entry is very low; you don't have to be learning the GUI of a new IDE at the same time that you're learning the programming language and, possibly, how to program. The standard library is small rather than overwhelming. And the download for standalone Lua development is tiny.

    • isr16 hours ago
      I know +1's are frowned upon here, but, err, +1? But, to cater to one of your criticisms - namely that lua's surface syntax could be a bit better. It's not bad, per se, but it doesn't quite match the elegance of lua's semantics or its flexibility.

      In which case, the regularity of lua's semantics comes to the rescue. It's much easier than in most languages to develop an alternative syntax which compiles down to it.

      Notice I said "alternative syntax", and not "other language which transpiles to lua". Because in the case of moonscript (or yuescript) or fennel - you're still writing lua. Using lua semantics. With no runtime overhead.

      Yes, you get this to varying degrees in other ecosystems (javascript et al). But it seems to me that the friction within lua is even lower. Once you buy into "closures all the way down" & "use tables for everything", then no matter what surface syntax you're designing, it's still, pretty much, lua.

  • shmerl16 hours ago
    I started using Lua for Cyberpunk 2077 mods and neovim configuration, it's fun to use.
  • pdimitar19 hours ago
    No transparent parallelism? And no, I don't mean threads.

    Underrated? Eh... it does stuff better than others but that doesn't mean much.

    We really don't learn.

    • RossBencina16 hours ago
      Lua has coroutines, and you can use a coro scheduler such as lanes if you wish.
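
      A minimal stock-Lua sketch of the mechanism itself (scheduler libraries build on exactly this resume/yield handshake):

      ```lua
      local producer = coroutine.create(function()
        for i = 1, 3 do coroutine.yield(i) end
      end)

      local got = {}
      while true do
        local ok, v = coroutine.resume(producer)
        if not ok or v == nil then break end  -- finished (or errored)
        got[#got + 1] = v
      end
      assert(#got == 3 and got[3] == 3)
      ```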
      • pdimitar16 hours ago
        Cool, good to know, thanks for bringing it to light.

        Still not a fan but it's nice that it has the facilities.

  • cute_boi22 hours ago
    I feel Scala is also very underrated.
    • fsloth22 hours ago
      How do you use Scala as an embedded scripting language?
  • aaroninsf13 hours ago
    Did they ever fix the 1-index thing?
  • booleandilemma17 hours ago
    The 1-based indexing feels too icky to me. It was a language designed for non-programmers and it shows, imo.
    • unleaded17 hours ago
      This is a very common complaint about Lua and I really don't understand it. I get that it's weird, but is it hard to remember while you're programming or something? I never run into it much anyway, but I guess it depends on what kind of code you're writing.

      I think there's a lot more to language choices than syntax and weird behaviors like that, it's more about how you run/implement the language and the ecosystem and things like that. Think about when people say javascript is a bad language because if you add together two empty arrays you get the number 0 or something. Or think about how you might be annoyed if you had to use a desktop app written in PHP (yes people do this) because the author likes PHP..

      • dmpk2k8 hours ago
        I used to think the same as you about 1-based indexing in Lua... until I tried using Lua in a game engine. I was constantly getting hit in the face with problems (e.g. coordinate systems and off-by-one errors), even after spending far too much time trying to abstract around this mismatch to make these problems go away.

        In theory I was okay with the indexing. In practice it was an absolute PITA.

        Plus there were other numerous papercuts; it felt like Lua was actively fighting me every step of the way. I went from liking Lua to absolutely despising it.

        In the end I switched to C++11, which amazingly was much more productive. As in night-and-day difference.

  • wruza17 hours ago
    Lua is not underrated. It's how many now, still 4(?) different languages, which continue to suicide after a few years, all with cringe [re]design decisions, half-assed mechanics and a lack of convenience that the authors refuse to address. I spent many years on it (embedding, metaprogramming, all the deep things) and it is one of my biggest regrets in choice of tech. The amount of effort you'll put into not sliding off this crazy ride nullifies all the promises of easiness, which are limited to trivial things anyway and explode with complexity in any non-trivial data exchange. Its only upside is low footprint, but I fail to see where it matters today (apart from nodemcu, whose active users could probably fit in one closet).

    https://github.com/nodemcu/nodemcu-firmware

    Based on Lua 5.1.4 or Lua 5.3 but without debug, io, os and (most of the) math modules

    As expected, they didn't bother to support all versions and probably were done with it after the 5.4 announcement.

    • unscaled16 hours ago
      Lua is definitely not underrated as an embedded language. It is the uncontested winner of this field. If anything, I feel it is slightly overrated: it is easy to embed, but has no other real strength. Everything else about Lua feels a little bit meh to me. It is completely serviceable, and there are many apps where I have used Lua very effectively, like Hammerspoon or Wezterm. But at the end of the day, every time I have to deal with a system that embeds Python or modern JavaScript, I am 2-5× more productive than I am with Lua. Sure, Python and JavaScript have their own warts, but they have grown and improved their ergonomics since the 90s. Lua, on the other hand, is a language that is philosophically hostile to ergonomics.

      Lua's philosophy, as far as I can tell, is to be minimalist and allow for emergent features. You want arrays? We don't natively support arrays, but you can use tables as arrays. You want classes? You can build your own class system or prototype system using tables. You want to implement anything else? Well, we've got some metaprogramming for you. The problem with this approach is that you don't end up with one language, but with hundreds of different and confusing dialects, where each programmer does their own thing. This is how JavaScript used to be with regard to concurrency and classes, and it wasn't good.

      The other issues that arise from this philosophy is that Lua lacks a lot of ergonomic features which cannot easily emerge out of its simple tables+closures+coroutines foundations. There are no type hints (like Python got), const (like JavaScript), default argument values, array/table destructuring, try/catch blocks, string interpolation, foreach loops, immutable values, etc.

      These features sure add a lot of complexity to the language implementation, but they are extremely valuable in saving programmer time and cutting down bugs. Lua takes the Worse Is Better approach here and saves implementation complexity by pushing it down to the user. If this complexity doesn't exist in the Lua interpreter, it doesn't disappear: it falls on the programmer, making bugs more likely and debugging harder.

      Lua might be a worthwhile trade-off where the choice is between embedding Lua or embedding nothing, since other languages are harder to embed. Or in industries where it's deeply established, like gaming. But I cannot say it's a language I enjoy programming in, and I don't see any reason to use it in a non-embedded context. Apparently no one else does either, but that's not because Lua is underrated.

      • fallous9 hours ago
        Your description of use-cases for Lua reminds me a lot of the use of Forth in the 80s and 90s.
      • raincole15 hours ago
        Lua kinda has an unofficial type hint implementation (via lua-language-server). But at this point I would just use TypeScript.

        > Lua's philosophy, as far as I can get it, is to be minimalist and allows for emergent features

        IMO this philosophy only shines in languages with macro, e.g. lisp family.

      • m46311 hours ago
        I have several friends who have worked on embedded interpreters, and one has used lots of forth, the other lots of tcl.

        Wonder how they compare?

    • raincole15 hours ago
      Personally I would choose JavaScript over Lua anytime*. Lua gives me "poor man's JavaScript" vibe.

      * (If the target hardware can afford V8/node runtime, obviously)

      • realusername9 hours ago
        > * (If the target hardware can afford V8/node runtime, obviously)

        If it doesn't, I'd still use Duktape or QuickJS over a Lua engine

    • mardifoufs15 hours ago
      What happened with the 5.4 announcement?
      • wruza15 hours ago
        You eventually become unamazed by news like a new Lua version and lose interest. It was interesting and controversial in the 5.0…5.2 period, but then code and user bases established themselves and no one wants to break code or infuriate users. With all the half-subjective flaws, previous (alternate, really) versions of Lua are absolutely “done” and usable, so just sticking with one is a reasonable thing.
    • MomsAVoxell11 hours ago
      What products have you made with Lua?
      • wruza11 hours ago
        Trading bots for Arqa Quik.

        Mostly functional cross-platform animated ui toolkit based on cairo/pango/gdk. Never published, probably still have a repo. I was proud of it cause it was better than gtk (if you ignore gc-related stutters).

        A backend for a boring accounting-related project.

        I also have developed an active-record like platform using Lua, but dropped the idea eventually. It was a repl-devenv-like thing with “active” ui, data, transparent networking, etc.

        In product-ish category that’s all probably. Not much of it were actual products, as in portfolio sense.

  • helpfulContrib22 hours ago
    I was infected by the Lua mind virus 15 years ago and I have not recovered.

    I simply don't see any reason not to use it .. for everything.

    I've built games with it. I've built GUI's with it. I've built scientific analysis tools with it. I've used it as a total solution for previously-impossible embedded problems. I've used it for desktop applications and mobile apps. I've automated entire production lines with it. It has huge appeal as a general purpose software development ecosystem - but those who know how to do this, tend to keep it to themselves for some reason.

    It can do everything.

    There's just nothing I want to do with computers that I can't, legitimately do, with Lua - or its VM. Or, LuaJIT. Or, all of the above and a bit of luarocks built in. Throw some luastatic in while we're at it, bundle it all up as a .deb, and nobody needs to know a thing.

    Its just so versatile. I hope someone builds an OS around it some day, making everything and anything the user wants to do, accessible through a Lua module, or some smart libffi wrappers around a handful of .so's, somewhere ..

    In my retirement, I'll take antirez' LOAD81, turn it into a proper Lua editor, add integration with luarocks and luastatic and maybe a shit-tonne of IPFS, and then there won't be any need, ever again, to deal with anything else.

    • HexDecOctBin5 hours ago
      How do you debug it? I tried a little bit and kept running into issues due to the dynamic typing.
    • discreteevent19 hours ago
      This happened to me about 6 months ago. (I didn't seek it out, just ended up working on something where lua was involved). So thanks for the warning. I didn't realise it could last for 15 years or more. The big thing for me is how easy it is to make something that feels like a DSL while it's still just Lua when you debug it.
  • garbagelang14 hours ago
    [flagged]
  • beanjuiceII21 hours ago
    nah, it's overrated ackshulaly
  • jmnicolas21 hours ago
    Meh. I don't get the love Lua gets on HN. At least version 5.1, I didn't try the others.

    No first-class support for OOP; you've got to use tables instead.

    No "continue" keyword to go to the next iteration of a loop, so you have to use some ugly elseif.

    No try catch!

    It's a dynamic language so anything goes. A function will return nil or a number or whatever you wish. A function can also return any number of values, so obviously some moron will abuse it: looking at you, Blizzard intern (you must have been an intern, right?) who thought returning 20 variables in the combat event log was a good idea. Ever heard of tables???

    The LSP can't do miracles when there are almost no rules, so auto-completion is more miss than hit, which is quite shocking in 2024. The AI is helpful sometimes but creates subtle bugs at other times. A simple uncaught typo will create a hard-to-find bug (yesterday copilot auto-completed myObject.Id instead of myObject.id; there went 20 minutes of my life trying to find why some code that had been running solidly for weeks was now failing silently).

    So all in all, Lua is fine to write small imperative scripts of a few hundred loc. Anything bigger and more complex you should run away IMHO.

    I never realized C# was such a well designed language until I tried Lua.

    • IshKebab19 hours ago
      I agree. Also tables are quite weird, and 1-based indexing is definitely a mistake (sorry 1 fans but it is).

      I think it just doesn't have much competition in the "embeddable languages" space unfortunately.

      As another commenter said, JavaScript is probably the best choice in most cases today.

      There's a list of options here, but most of them aren't really good choices:

      https://github.com/dbohdan/embedded-scripting-languages

      E.g. Python would be a terrible choice in most cases. Awk? Some of them aren't even general purpose programming languages like Dhall.

      If you narrow it down to embeddable-like-Lua there are hardly any.

    • cardanome17 hours ago
      > No first class support for OOP, you got to use tables instead.

      You get metatables instead, which allow you to trivially write an OOP system that perfectly fits your needs. Yes, there is more than class-based OOP.

      And I mean trivially: https://lua-users.org/wiki/SimpleLuaClasses

      For some people not being locked into one form of OOP or any OOP at all is a huge advantage.
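
      The linked page boils down to a few lines of metatable plumbing, roughly like this (hypothetical 'Point' class, in the spirit of SimpleLuaClasses):

      ```lua
      local Point = {}
      Point.__index = Point  -- failed lookups on instances fall back to Point

      function Point.new(x, y)
        return setmetatable({x = x, y = y}, Point)
      end

      function Point:dist2()  -- ':' sugar passes self implicitly
        return self.x * self.x + self.y * self.y
      end

      local p = Point.new(3, 4)
      assert(p:dist2() == 25)
      ```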

      > No "continue" keyword to go to the next iteration of a loop. So you have to use some ugly elseif.

      Yeah, that sucks. Thankfully Luau fixes that and basically all annoyances I have with Lua. https://luau.org/syntax
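      For completeness, stock Lua 5.2+ does have a workaround via `goto` (a label is allowed as the last statement of a block), though it's clunkier than Luau's real `continue`:

      ```lua
      for i = 1, 6 do
        if i % 2 == 0 then goto continue end
        print(i)  -- prints 1, 3, 5
        ::continue::
      end
      ```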

      > I never realized C# was such a well designed language until I tried Lua.

      Lua and C# have fundamentally different design goals. C# sucks for embedding: you have to ship a huge runtime, and Godot still hasn't figured out how to support C# in web builds. Lua values minimalism and flexibility, which obviously means it lacks some quality-of-life features of bigger general-purpose languages.

      Language design is about trade offs.

    • garbagelang14 hours ago
      [flagged]
  • pentaphobe22 hours ago
    Out of curiosity: does this post read like LLM (generated or assisted) to anyone else?
    • driggs17 hours ago
      An LLM would have written a much more convincing argument that "Lua is underrated".

      This blog has just three posts, none with significant content. I was curious how this article even wound up on HN, until I realized that it was submitted by the blog's author: https://news.ycombinator.com/user?id=nflatrea

    • lioeters19 hours ago
      Pretty sure in a year or two the ratio will flip, and people will instead pick up on articles that are *not* assisted by an LLM: does this post read like it's been written entirely by a human? And the models will probably keep getting better until it's mostly indistinguishable.
      • leptons19 hours ago
        LLMs and bots will probably be the thing that leads to a second version of the internet being created. Eventually the thing we currently know as "the internet" will be nothing but bots and LLMs continually regurgitating in a giant 100-billion-watt-consuming feedback loop.
    • johnzim20 hours ago
      The conclusion at the end definitely does
  • TekMol22 hours ago
    When it comes to languages, I was enlightened when I changed from {}; languages to Python. I could never go back.
    • Trasmatta22 hours ago
      Ruby is the sweet spot for me. I don't like the syntactic whitespace in Python, but I still prefer the cleaner look of not having braces everywhere.
      • librasteve22 hours ago
        raku - the bastard child of functional and OO

          class MyTable is export {
            has @.data;

            multi method new(@data) {
              $.new: :@data;
            }

            method render {
              table :border<1>,
                tbody do for @!data -> @row {
                  tr do for @row -> $cell {
                    td $cell
                  }
                };
            }
          }
        • DonHopkins19 hours ago
          Many languages like Ruby have sweet "Syntactic Sugar", but Raku has "Syntactic Syrup of Ipecac".
          • librasteve1 hour ago
            lol - thanks for adding to my vocab!

            [in defence of raku, while it truly is marmite wrt Ruby / Python, there is a very loyal core of smart people that love the freedom and expressiveness it offers]

    • tail_exchange18 hours ago
      It's the other way around for me, and I don't understand why people like Python's system so much. Moving code around and changing scope in other languages is just a matter of adjusting the braces, but with Python I need to carefully reindent all affected lines. It's error-prone and annoying.
      • anonzzzies14 hours ago
        People say (and it is true) that you read code more than you write it, so the careful indenting is for the reading case. But I don't find Python easier to read than C; I actually find the verbosity quite annoying. Plus there's what you said about usability for the coder. It's nice as a teaching language, but beyond that I have no clue why people use it. If you read modern AI code in it, it's not very readable at all, because it's not really about Python but about numpy and torch etc., which could have been done in another language that might have been nicer for the purpose. Look at gpt2 for instance: it's verbose to type and read, for code that isn't readable anyway by someone who doesn't understand the theory.