> Makes Brendan Eich look like a total clown in comparison.
To be fair, Brendan Eich was making a scripting language for the '90s web. It isn't his fault Silicon Valley decided that language needed to become the Ur-language to replace all application development in the future.
Most of the blame should go to Netscape management. They didn't give Eich much time, then burst in before he was done and made him copy a bunch of things from Java. (The new language, codenamed "Mocha" internally, was first publicly announced as "LiveScript", and then renamed "JavaScript" after Sun threw a bunch of money at Netscape.)
IIRC, Eich was quite influenced by Python's design. I wish he'd just used Lua; it would likely have saved a lot of pain. (Although, all that said, I have no idea what Lua looked like in 1994, or how much of its design has changed since then.)
It sounds like you're saying Yoz got the sequence of events wrong, and that MILLJ ("make it look like Java") was a necessary part of getting scripting in the browser? I sort of had the impression that the reason they hired you in the first place was that they wanted scripting in the browser, but I wasn't there.
I don't think Lua was designed to enforce a security boundary between the user and the programmer, which was a pretty unusual requirement, and very tricky to retrofit. However, contrary to what you say in that comment, I don't think Lua's target system support or evolutionary path would have been a problem. The Lua runtime wasn't (and isn't) OS-dependent, and it didn't evolve rapidly.
But finding that out would have taken time, and time was extremely precious right then. Also, Lua wasn't open-source yet. (See https://compilers.iecc.com/comparch/article/94-07-051.) And it didn't look like Java. So Lua had two fatal flaws, even apart from the opportunity cost of digging into it to see if it was suitable. Three if you count the security role thing.
Yes, the Sun/Netscape Java deal included MILLJ orders from on high, and thereby wrecked any Scheme, HyperTalk, Logo, or Self syntax for what became JS.
Lua differed a lot (so did Python) back in 1995. Any existing language, ignoring the security probs, would have been flash-frozen and (at best) slowly and spasmodically updated by something like the ECMA TC39 TG1 group, a perma-fork from 1995 on.
Well, I'm not dead yet! Looking for work, which is harder since I'm in Argentina.
Flash-freezing Lua might not have been so bad; that's basically what every project using Lua does anyway. And by 01995 it was open source.
In case anyone is interested, here's a test function from the Lua 2.1 release (February 01995):
function savevar (n,v)
  if v == nil then return end;
  if type(v) == "number" then print(n.."="..v) return end
  if type(v) == "string" then print(n.."='"..v.."'") return end
  if type(v) == "table" then
    if v.__visited__ ~= nil then
      print(n .. "=" .. v.__visited__);
    else
      print(n.."=@()")
      v.__visited__ = n;
      local r,f;
      r,f = next(v,nil);
      while r ~= nil do
        if r ~= "__visited__" then
          if type(r) == 'string' then
            savevar(n.."['"..r.."']",f)
          else
            savevar(n.."["..r.."]",f)
          end
        end
        r,f = next(v,r)
      end
    end
  end
end
It wouldn't have been suitable in some other ways. For example, in versions of Lua since 4.0 (released in 02000), most Lua API functions take a lua_State* as their first argument, so that you can have multiple Lua interpreters active in the same process. All earlier versions of Lua stored the interpreter state in static variables, so you could only have one Lua interpreter per process, clearly a nonstarter for the JavaScript use case.
The Lua version history https://www.lua.org/versions.html gives some indication of what a hypothetical Sketnape Inc. would have been missing out on by embedding Lua instead of JavaScript. Did JavaScript have lexical scoping with full (implicit) closures from the beginning? Because I remember being very pleasantly surprised to discover that it did when I tried it in 02000, and Lua didn't get that feature until Lua 5.0 in 02003.
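In case it helps, here's a minimal sketch (mine, not from the Lua distribution) of the kind of closure that Lua 5.0's lexical scoping finally allowed; if I remember the older upvalue mechanism right, pre-5.0 Luas could only capture a frozen, read-only copy of an enclosing value, so a counter like this wasn't expressible:

function make_counter()
  local n = 0
  return function()  -- requires Lua 5.0+: the inner function captures the enclosing local n
    n = n + 1        -- and mutates it across calls
    return n
  end
end

c = make_counter()
print(c())  --> 1
print(c())  --> 2

JavaScript accepted the same shape when I tried it in 02000, which is presumably why it felt so pleasantly Scheme-like.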
> It isn't his fault Silicon Valley decided that language needed to become the Ur-language to replace all application development in the future.
Which remains one of the most baffling decisions of all time. JavaScript is unpleasant to work with in the browser, the place it was designed for. It is utterly beyond me why anyone would go out of their way to use it in contexts where there are countless better languages available for the job. At least in the browser you pretty much have to use JS, so there's a good reason to tolerate it. Not so outside of the browser.
> To be fair, Brendan Eich was making a scripting language for the 90's web.
He was, and he doesn't deserve the full blame for being bad at designing a language when that wasn't his prior job or field of specialization.
But Lua is older, so there's this element of "it didn't need to be this bad, he just fucked up." (And Eich being a jerk makes it amusing to pour some salt on that wound. Everyone understands it's not entirely serious.)
"Silicon Valley" is not an actor (human or organization of humans) that decided any such thing. This is like saying a virus decides to infect a host. JS got on first, and that meant it stuck. After getting on first, switching costs and sunk costs (fallacy or not) kept it going.
The pressure to evolve JS in a more fair-play standards setting rose and fell as browser competition rose and fell, because browser vendors compete for developers as lead users and promoters. Before competition came back, a leading or upstart browser could and did innovate ahead of the last JS standard. IE did this with DHTML mostly outside the core language, which MS helped standardize at the same time. I did it in Mozilla's engine in the late '90s, implementing things that made it into ES3, ES5, and ES6 (Array extras, getters and setters, more).
But the evolutionary regime everyone operated in didn't "decide" anything. There was and is no "Silicon Valley" entity calling such shots.
> "Silicon Valley" is not an actor (human or organization of humans) that decided any such thing.
Oh come on, you understand full well that they're referring to the wider SV business/software development "ecosystem".
Which is absolutely to blame for JavaScript becoming the default language for full-stack development, and for the resulting JS ecosystem being a dysfunctional shitshow.
Most of this new JS ecosystem was built by venture-capital startups and tech giants obsessed with deploying quickly, with near-total disregard for actually building something robustly functional and sustainable.
E.g., React as a framework does not make sense in the real world. It is simply too slow on the median device.
It does, however, make sense in the world of the venture-capital startup, where you don't need users to be able to actually use your app/website well. You only need that app/website to exist ASAP so you can collect the next round of investment.
Companies including Bloomberg and Microsoft (neither in nor part of Silicon Valley), along with companies big and small all over the world, built on JS once Moore's Law and browser tech combined to make Oddpost, and then Gmail, feasible.
While the Web 2.0 foundations were being laid by indie devs, Yahoo!, Google, and others in and out of the valley, most valley bigcos were building "RIAs" on Java, then Flash. JS did not get some valley-wide endorsement early or all at once.
While there was no command-economy leader or bureaucracy to dictate "JS got on first but it is better to replace it with [VBScript, likeliest candidate]", Microsoft did try a two-step approach after reacting to JS and reverse-engineering it as "JScript".
They also created VBScript alongside JScript and worked to promote it too (its most-used sites were MS sites), but JS got on first, so MS was too late even by the IE3 era, and IE3 was not competitive vs. Netscape or tied to Windows. IE4 was better than Netscape 3 or the tardy, buggy Netscape 4 on Windows; and, most important, it was tied. For this tying, MS was convicted in _U.S. v. Microsoft_ of abusing its OS monopoly.
Think of JS as evolution in action. A 2024-era fable about the Silly Valley cartel picking JS early or coherently may make you feel good, but it’s false.
I don't think JavaScript will replace all application development in the future. WebAssembly will displace JavaScript. With WebAssembly you can use whatever language you like and achieve higher performance than JavaScript.