Love & hate.

I am not a developer or programmer; I am a simple user. Well, I'm a bit more knowledgeable than the average user :)


The programming language I hate the most is Lua. My favorite is Lisp (list processing), a functional programming language.
I'm curious, why Lua? Used in the right context, as a powerful scripting language, I like it.

Love: C, R, Go, Rust, Lua, Haskell
Hate: JavaScript
 
I'm curious, why Lua?
I will try to answer as briefly as possible :)

Dynamically-typed languages in general tend to be more error prone, but Lua has problems that are not inherently caused by dynamic typing. Some of these also apply to other languages, such as JavaScript and/or Python, but Lua has a particularly bad combination of them, and some of them were addressed in JavaScript by the “use strict” directive (an indication that there is demand for avoiding these problems in programming languages). TypeScript was created because of problems like these in JavaScript.

Examples:

1. Accessing an undefined variable or class member (table entry) is not an error (it returns nil). For example, if a developer mistypes the name of a variable, the code will still compile, and the error will be discovered only much later, possibly after spending a lot of time investigating a failure that was an indirect result of the mistake. Using magic strings/values is generally considered bad practice in almost any language, but almost every user-defined name in Lua is effectively a magic string/value. (Globals and class members are accesses into some sort of ‘table’ (map) using a string key.)

It would be more useful for this to raise an error, while providing an explicit way of testing whether a key exists in a table (perhaps by returning nil only when indexing is done with square brackets).
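
A minimal sketch of what this looks like in practice (the names are invented for illustration):
```lua
local player = { health = 100 }

-- Typo when assigning: this silently creates a new field "helth"
-- instead of updating "health"; no error is raised anywhere
player.helth = player.health - 150

-- This check still sees health == 100, so the bug surfaces much later
-- (if at all), far from the line that actually caused it
if player.health <= 0 then
  print("game over")
end

-- Reading a mistyped global is not an error either; it just yields nil
print(playr)   --> nil
```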

2. Implicit creation of variables: Variables can be assigned without being declared. Similar to point 1, this might not be what the developer intended.

3. Implicitly created variables are global. (Similar to JavaScript.) Extensive use of globals is considered a bad programming practice in general, so having the language do so by default is quite undesirable.
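
A short sketch covering points 2 and 3 (illustrative names only):
```lua
local function resetScore()
  -- Missing "local": this silently creates a global variable named
  -- "score", visible to the whole program, with no warning
  score = 0
end

resetScore()
print(score)            --> 0

local counter = 0
countre = counter + 1   -- typo: creates yet another global;
                        -- "counter" itself is left unchanged
```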

4. Methods can be called on an instance of an unrelated class (because passing the self-reference is explicit), i.e. even when using an instance (a table used as an object instance) to invoke a method, you can pass a different (potentially unrelated) instance as the 'self' reference. Being able to call a method on the wrong class is not a useful feature.

Of course, this is partly due to Lua’s flexibility, which enables it to be used as an OO language, while not necessarily explicitly designed as one. However, it does have syntactic sugar specifically for OO. The designers could have avoided this problem by enforcing a rule that functions declared using the `:` syntax must be called using it (thereby requiring them to be used in an OO way).

If a developer forgets that a certain method requires a self-reference (i.e. is an instance method), they end up calling the method with what should have been its first real parameter acting as the self-reference. If that argument is a table, the method may modify it, leaving it in a corrupted/invalid state that is discovered only much later, when it causes something else to fail (at which point it may be difficult to determine what put it in that state).
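
A sketch of the kind of silent mix-up this allows (the class and field names are invented for the example):
```lua
local Account = {}
Account.__index = Account

function Account.new(balance)
  return setmetatable({ balance = balance }, Account)
end

function Account:deposit(amount)   -- ":" declares an implicit "self"
  self.balance = self.balance + amount
end

local acct  = Account.new(100)
local stats = { balance = 0, games = 7 }  -- unrelated table that happens
                                          -- to have a "balance" field

-- Passing the wrong table as the self-reference is perfectly legal:
-- "stats" is silently modified, "acct" is left untouched, and the
-- corrupted state only surfaces wherever "stats" is used later
acct.deposit(stats, 50)
print(acct.balance)    --> 100
print(stats.balance)   --> 50
```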

5. Function parameters are not validated.

Not ensuring that arguments have the expected type is a common problem of dynamically typed languages, and since all parameters are effectively optional, passing too few is not an error (though a better solution would have been to provide a syntax for indicating whether each parameter is optional). But Lua doesn’t even give an error when too many arguments are passed. This is not a result of a useful feature.
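
A quick sketch with a hypothetical function:
```lua
local function connect(host, port, timeout)
  print(host, port, timeout)
end

-- Too few arguments: the missing parameters silently become nil
connect("example.com")                          --> example.com  nil  nil

-- Too many arguments: the extras are silently discarded
connect("example.com", 8080, 30, "tls", true)   --> example.com  8080  30
```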

6. Lack of data hiding. (Similar to Python and JavaScript.)
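
For example (an invented, minimal case): every field of a table is public, so nothing stops outside code from doing this:
```lua
local account = { balance = 100 }   -- meant to be changed only through
                                    -- deposit/withdraw methods

-- Any code holding a reference to the table can read or overwrite the
-- field directly, bypassing whatever methods were intended as the API
account.balance = -999999
```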

7. Lack of constants. Developers have to choose between using variables or hardcoding constant values as literals (potentially in many places). The latter is more efficient, so lack of constants encourages another bad practice. This is probably done to keep the language small, but introducing them for primitive values would have no runtime overhead. (The value could be substituted when generating the bytecode).
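
A sketch of that trade-off (this reflects Lua 5.3 and earlier, the versions discussed here; Lua 5.4 later added a `<const>` attribute for locals):
```lua
-- Option 1: an ordinary variable. It is defined in one place,
-- but nothing stops other code from reassigning it.
local MAX_RETRIES = 3
MAX_RETRIES = 99               -- silently allowed

-- Option 2: hardcode the literal everywhere. This avoids the variable,
-- but becomes a maintenance hazard if the value ever has to change.
for attempt = 1, 3 do
  -- ... retry logic ...
end
```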

When modifying code in such a way that its interface (how it is called) is changed, in most languages, calling code that is broken by the change would fail to compile, thus alerting the developer to what has to be changed. In Lua, the calling code will typically fail only at runtime, and not even directly cause an error then, leading to a subsequent error somewhere else.

When debugging, a proper OO structure (avoiding the problems mentioned above) narrows down where you have to look for possible causes of a bug. For example, if a field is private, you have only to check the code of its own class to see how it got a wrong value. When you don't have the ability to make it private, you can't be sure that no one on the team forgot that they weren't supposed to modify it from outside of the class, and so the error could be anywhere.

The need for compile-time checks is more important in an embedded scripting language (the intended use of Lua), in my opinion, because it is more likely that it will be used by inexperienced developers.

It is even more important when used for scripting games, since games tend to be more difficult to debug.

Also, it has some unusual and inconsistent operator names:
The unary and binary `~` operators have different meanings (bitwise 'not' and 'xor'), and don't appear to be related to the use of "~" outside of Lua (for example, it can mean 'approximate');
`~=` (for inequality) resembles the general 'approximately equal' symbol;
`//` (integer division) is a comment in C++-based languages.
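
For readers who haven't run into them, this is what those operators do (requires Lua 5.3 or later, where the bitwise operators were introduced):
```lua
print(~5)        --> -6      unary "~" is bitwise NOT
print(5 ~ 3)     --> 6       binary "~" is bitwise XOR
print(5 ~= 3)    --> true    "~=" means "not equal" ("!=" elsewhere)
print(7 // 2)    --> 3       "//" is floor division, not a comment
```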

As others have pointed out, it lacks features of other languages, and has a smaller standard library, but this has advantages too, making it small and efficient.

The small size of the language specification is also an advantage for a scripting language, since it means that users can learn it quickly, but its error-proneness makes it particularly unsuited to beginners.

Some features are bad for performance: accessing almost anything requires a table lookup (but dynamic language interpreter implementations are likely to involve a lot of lookups anyway, and it's designed to be implemented as an interpreter), and it lacks more efficient data types (though version 5.3 introduced integers); for example, you can't use double precision where you need it and still use single precision in other places. But this makes it simpler, and it achieves quite good performance anyway (when comparing Lua interpreters to other interpreters and Lua compilers to other compilers).
 
While in general, I don't disagree with you, I'll add a few "but what about" comments.

Dynamically-typed languages in general tend to be more error prone,
That's the price you pay for flexibility and efficiency (here: how fast to develop, not CPU usage). It leads to a different development method; if you need to build reliable systems in dynamically-typed languages, you use coding standards and testing methods differently.

even when using an instance (a table used as an object instance) to invoke a method, you can pass a different (potentially unrelated) instance as the 'self' reference. Being able to call a method on the wrong class is not a useful feature.
Actually, it can be useful. It's an example of "duck typing": If I can call the "walk" method on an object, and it walks like a duck, then it is a duck for my current purposes. But it implies that the developer has to be careful, and think through what "walk" means.
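
A tiny illustration of that point, building on Lua's explicit self (names invented):
```lua
local Duck = {}
Duck.__index = Duck

function Duck:walk()
  print(self.name .. " waddles off")
end

-- An unrelated table that happens to have the one field walk() needs
local robot = { name = "R2" }

-- Explicit self makes this legal: for the purposes of walk(),
-- the robot walks like a duck
Duck.walk(robot)   --> R2 waddles off
```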

Lua doesn’t even give an error when too many arguments are passed. This is not a result of a useful feature.
Other systems share that behavior of ignoring extra arguments. For example, pyplot (matplotlib): you can pass arguments such as "color=red" to functions that don't draw anything that can be colored. The design principle there is that an argument such as "color" is honored when it makes sense to honor it, and ignored when it makes no sense. This simplifies design, because if you want everything visible to be red, you just add "color=red" to every function call.

Lack of data hiding.
I don't know of any language that really has data hiding. As a matter of fact, it's even debatable whether data hiding can be implemented by a programming language alone, without support from the hardware (to tag memory cells by type, and to divide software into rings and do access control by ring). You think C or C++ have data hiding? I'll take the pointer to that C++ object, cast it to void*, cast it to an array of chars, and your data is completely out in the open, naked, every bit of it.

What languages do have is ways of making access to "hidden" data more or less difficult. If it is really difficult, a developer who modifies existing code is more likely to notice that someone is deliberately looking at hidden data; in a language like Python, one has to rely on the "single leading underscore" convention instead.

For example, if a field is private, you have only to check the code of its own class to see how it got a wrong value.
And you have to look at every line of code that can write to memory using a pointer. Because that pointer could be wrong. And in C/C++ that means you have to look at every single line of code.

In reality, things like data hiding, modularity, and all other forms of "preventing mistakes" are not a technical question, but a sociological one. And you explicitly acknowledge that in this statement:
The need for compile-time checks is more important in an embedded scripting language (the intended use of Lua), in my opinion, because it is more likely that it will be used by inexperienced developers.
BINGO! If you have a team of super expert crack programmers (like Aho, Weinberger and Kernighan, or Ritchie and Thompson), then you can write high quality and ultra-reliable software (such as awk or Unix) in any programming language, even totally crappy ones (like C). If you only have mediocre programmers that are inherently clueless and unreliable, then you have to put "guard rails" around them. For example, take away their ability to use pointers (which is why Java and Python help), so they can't just hack around restrictions. Write and enforce very strict coding standards, so the source code clearly shows the intent of the developer, and dirty tricks are immediately visible and obvious. Use very strict testing regimes (both automated testing, such as continuous builds, and human penetration testing), to make sure nothing gets overlooked. And so on. This is the difference between computer science and software engineering: Engineering means to build useful systems in the real world (with all its messes, shortcomings and compromises), as opposed to reasoning about what is theoretically possible.

The unary and binary `~` operators have different meanings (bitwise 'not' and 'xor'), ...
`//` (integer division) is a comment in C++-based languages.
The area of not, xor, exponentiation and modulo has never been handled consistently between languages. For example, look at how "^" and "**" are abused. My favorite example (of something to hate) is how C++ took the >> and << operators (which used to mean bit shift) and turned them into input/output. What this really means is that languages are domain specific. For example, in languages used for floating-point calculation, the operator "^" really should mean exponentiation, in languages used for bit-wise operations (which happen in logic design), & and | and xor (however spelled) need to be first class citizens, and so on. If you are specializing in string operations (for example the old SNOBOL language), use concise and clear operators ... for string operations, not for arithmetic or boolean. Beauty really is in the eye of the beholder.
 
I suspect those days are mostly gone....
Big money software tends to be aimed at big organizations... like MyChart, a patient portal that many hospitals use. There's no pricing info for MyChart licenses...

To combat that, we have ownCloud/nextCloud in ports, and plenty of pretty good web front-ends available for free. If you have the time, expertise and hardware available, you can put together something that rivals MyChart. Trouble is - MyChart has cornered the market already. So, even if you can put something together, there's no more money in that.
 
(Off-topic) I also don't think a group of amateurs can put together something like MyChart. To build systems that can be used in the medical field, you need to understand a few things: (a) How the medical industry works, meaning what are their workflows, how are tasks done. (b) Where the funding in the medical industry comes from, so you can build solutions that are economically viable and sensible. (c) How regulation is handled, like FDA approvals, HIPAA compliance, and the worldwide and local variations on that (for example, in the US Medicare regulations and reimbursement schedules are state-by-state, not nationwide).

People who write free software often start out as "programmers" (technically called software engineers) looking for a hobby: "Oh look, I wrote this small perl script to drive my toy 3D printer, let's put it on Github and it will dominate the CAD market", or "Oh look, I have this C code that can do some image manipulations, maybe it can be used with some AI to find cancer in x-ray pictures". Sometimes this works (Linux after all started like this). In medical, this absolutely does not work. You want to build medical systems, you start with a dozen people doing opportunity and requirements analysis, another dozen people who have industry experience and know the patient- and money flows, and a set of people who know how to do the paperwork stuff. The single most important hire for a medical company is the "Vice President of Regulatory Affairs".

Anecdote: I used to work for a large computer company; it used to be the largest of them all. They had lots and lots of really smart people, both in computer science / software engineering and in many other fields of science and engineering (for example, a few physics Nobel prize winners). At some point in the early 2000s, they decided to use some machine-learning software that had become famous for cracking a television game show, named after the assistants to both Mr. Bell (inventor of the telephone) and Sherlock Holmes (that's a joke). They wanted to take that technology (which they thought was revolutionary) and use it to revolutionize information technology and data processing in the medical industry, and in the process greatly improve efficiency and service delivery in medicine.

They fell flat on their face, completely. After investing many billions of dollars, they sold the wreckage for pennies on the dollar. What went wrong? Hubris: they thought they were so smaaaaht, they ignored that the medical environment is difficult to work in. A better solution that doesn't get FDA approval isn't a solution. A good idea which forces hospitals to work in a way that is demonstrably impossible, or that needs money in places where money traditionally doesn't exist, is not a good idea, even if it looks good on paper.

Another part of the problem was hype: the medical industry is a very large part of the GDP, and one that traditionally doesn't spend much on IT (much less than other service industries). So the computer company thought that they could insert themselves into the middle of it and take 10% of the healthcare revenue (which would be a huge amount of money) in exchange for making hospitals 11% more efficient. They sold the idea to Wall Street with such numbers, completely forgetting that to do rent-seeking, you first have to actually have something to rent out.

Building MyChart has taken many thousands of people a generation. A small number of people "putting something together" just is not competitive. Even IBM, with its enormous resources, failed at that.
 
Epic Systems, which owns (or made) MyChart, started in a basement in 1979 with 1 1/2 employees & $70,000! AFAIK they are still privately owned. You don't need tremendous resources and the most brilliant people (they might even be detrimental); you just need to be in the right place at the right time and not blow your opportunity!
 
Mr. Bell (inventor of the telephone)
Italian engineer Antonio Meucci and German inventor Philipp Reis independently invented telephone-like devices that achieved the key breakthrough of turning sound into electric signals over a decade before Bell.

[Attached image: Philipp Reis]
 
What you need is
  1. A good idea.
  2. Some good engineers.
  3. A lot of (patent) lawyers.
Oh, and enough money to survive a year or two at any point.
 
What you need is
  1. A good idea.
  2. Some good engineers.
  3. A lot of (patent) lawyers.
Oh, and enough money to survive a year or two at any point.
If only. A friend of mine had a great idea, raised tens of millions, had a great team, excellent engineers and they built a solid product but the timing was all wrong -- right when the dot-com bubble burst.
 
If only. A friend of mine had a great idea, raised tens of millions, had a great team, excellent engineers and they built a solid product but the timing was all wrong -- right when the dot-com bubble burst.
That is what most "everyone can make it if he wants to" types get wrong. You need a chance and good luck; otherwise, you stay stuck.
Or you have a great idea and raise your money by issuing stock. Very. Bad. Idea.
 
I like: C, C++, C# and Python
I don't hate any programming language; I believe that every programming language has a right to exist.
 