Any decent language that wants devs to be able to reason about operators and types would throw a TypeError here.
Shitty languages will cause devs to have to purchase books entitled "ShittyLanguage: The Non-Shitty Parts", wherein chapter 8 talks about "avoid using the subtraction operator because of ambiguous precedence, transitivity, and coercion rules; instead use jQuery.minus()."
Easy.. don't let shitty programmers write code.. oh, that's right, everyone has to learn/start somewhere.
I've seen some pretty horrible code, with any number of bugs, in pretty much every language I've worked with. If you're passing a string into a function that expects a number, you deserve what you get. parseInt(value, 10) for user input isn't so hard.. if you want to ensure a numeric value, you can always ~~value ... though that's slightly less obvious to someone new to the language.
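For example (a rough sketch; "value" stands in for hypothetical user input):

    var value = "42";      // pretend this came from a form field
    parseInt(value, 10);   // 42  -- explicit base-10 parse
    ~~value;               // 42  -- double bitwise NOT truncates to a 32-bit integer
    Number(value);         // 42  -- plain numeric conversion
    // The three differ on messy input:
    parseInt("12px", 10);  // 12  -- stops at the first non-digit
    Number("12px");        // NaN -- the whole string must be numeric
    ~~"12px";              // 0   -- NaN truncates to 0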
JS's expression evaluation is far nicer than most languages I've worked with.. aside from C#'s addition of the ?. operator, I can't think of much that comes close to being as fluid in terms of handling end-user input and massaging it into something that works correctly.
Your comments remind me of the XML-everywhere mindset that used to be so prevalent in "enterprise" programming... JSON is a much better abstraction for data models, as it is less disconnected from actual code.
Personally, I prefer a "don't cause an error unless you really have to" approach to development... if you can recover from an error condition, log it and do so... if you can't, blow up the world. Java's error handling comes to mind here as particularly cumbersome to deal with... Node's typical callback pattern, and similarly promises/thenables, are much easier to work with in practice.
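A minimal sketch of that error-first callback style (the "config.json" path is made up for illustration):

    var fs = require('fs');
    fs.readFile('config.json', 'utf8', function (err, data) {
      if (err) {
        // recoverable: log it and fall back to a default
        console.error('could not read config, using defaults:', err.message);
        data = '{}';
      }
      var config = JSON.parse(data);
      // ... carry on with config
    });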
JS has some really hideous parts... just the same, the Browser is an environment where you expect things to do "something" and mostly still work when parts break... the services that back browsers should likely do the same. JS is a good fit for this use case.
I really like this "1-1" answer. That's very creative :)
But honestly, compare each one of these answers to JS's official workaround: calling the Number() constructor on the primitive string value to attempt a conversion to a number (since the subtraction operator is only defined for numbers) and then proceeding to complete the operation.
For me, it looks very logical and convincing given that the language is loosely typed, and it would break its heart to fail any of its loved users :)
1 + "2" (12) does a string concatenation.
1 - "2" (-1) does math.
That said, whilst JS is loosely typed and won't fall over when you do this, I'd just see it as bad form to find code mixing types to this extent. Just because the language will let you do it, doesn't mean you should actually do it!
To be fair, the + issue is a mistake many other languages have made too. Incredibly, PHP, poster child of bad languages, gets this right and splits + and . (though this may just be because it copied Perl).
It's fine to have a (string) + (string) operator, it's just generally a less than brilliant idea to have a (string) + (arbitrary) operator that does an implicit string conversion.
But all of that's fine compared to PHP's implicit number conversion that ignores trailing characters, presumably so that you can add "3 onions" to "1 kg of bacon" or some such...
48 might have been a much saner answer; or just disallow operations that don't really make any sense, instead of implicitly casting in non-obvious ways.
When you see a character, it usually has an underlying representation as an unsigned integer (or a byte, or an array of bytes, or an int if that's your thing, with the size depending on ASCII, Unicode, etc.). In both UTF-8 and ASCII the character "1" has a value of decimal 49. If the language actually allows you to subtract a number from a character (or a length-one string, depending on language semantics), which is dubious to allow anyway, the expected behavior should be to return 48; but it's really a code smell to even attempt that operation.
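For what it's worth, the char-code route is available in JS; it just has to be explicit (a sketch):

    "1".charCodeAt(0);        // 49 -- the code unit for "1"
    "1".charCodeAt(0) - 1;    // 48 -- the answer suggested above
    String.fromCharCode(48);  // "0"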
Because JavaScript was first and foremost a language for validating user input... user input will always (well, originally) be a string. In this light the type coercion choices JS makes are usually pretty sane. Given that the Browser is an environment where a user expects things to mostly still work in the case of an error (in formatting, markup, etc), having that carry through to the language is a pretty logical choice.
Given those two things, these edge cases are entirely sane... and the flexibility of the language makes it a very good choice for a lot of tasks. No, it doesn't satisfy a purist mindset when it comes to languages.. but in the context of its design goals, it makes perfect sense, and it's easy enough to avoid these scenarios when writing code that doesn't interact with user data.
For the record, I've seen C# developers pass around Guids (UUIDs) as strings, even for code that never crosses a wire and where the DBMS uses a native UUID format. I've seen Java developers do some pretty asinine things too. The language can't always protect you from shooting yourself... in the case of JS, for the most part at least, a small bug doesn't take the building down with it.
String.prototype.charCodeAt and String.fromCharCode are never called in implicit type conversion. Type conversion is always done by calling primitive constructor functions. In the case of subtraction ("1" - 1), Number is called on any non-numeric operand.
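So the two interpretations diverge like this (sketch):

    Number("1") - 1;        // 0  -- what "1" - 1 actually does
    "1".charCodeAt(0) - 1;  // 48 -- what it would do if char codes were consulted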
If you're going to insist that a character (or string) is its own magic datatype _and is not an unsigned (or similar) underneath it all_, and you're going to talk about high-level niceties, then your interpreter really, really should not violate the principle of least astonishment. There is no sane way to frame "1" - 1 _unless you explicitly typecast the string to a number_, because you now have to reconcile it with what should be identical behavior for numeric types, like "1"+(-1), which, guess what, yields "1-1" in javascript, which is the definition of insane. You've also got to deal with other less obvious cases, like when the string is in a var and is not _always guaranteed to be a nice number_, etc., which really makes ever attempting that operation a code smell. It's far easier for both the programmer and the interpreter _not_ to play the guessing game and implement inconsistent numeric behavior, and to instead just say "well, I'm not going to do this unless you really insist (via an explicit cast) that you want this".
Having automatic type casting of the form we've seen above present in the language is like having a gun without a safety--it's that 1% of the time that the pin inadvertently strikes the shell that you will really really wish you had had a language that would have faulted rather than silently proceeding with broken logic and now potentially disastrously bad data. I can't believe that anyone would pick a language like this that would allow you (especially silently) to be this sloppy.
Now tell me, what is f("1", 2)? Is this really what you would expect? Treating strings like strings sometimes and like integers other times messes up functions, creating bugs that are extremely hard to track. Usually you can get back what you started with if you subtract after you add, but JavaScript ruins that unless you first check that the inputs are numbers. If it always tried to coerce the string into an integer, or vice versa, it would be fine.
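The upthread definition of f isn't quoted here, but a hypothetical pair of stand-ins makes the point either way:

    // hypothetical stand-ins for the f being discussed
    function addThenSub(x, y) { return x + y - y; }
    function subThenAdd(x, y) { return x - y + y; }

    addThenSub(1, 2);    // 1
    addThenSub("1", 2);  // 10 -- "1" + 2 === "12", then "12" - 2 === 10
    subThenAdd(1, 2);    // 1
    subThenAdd("1", 2);  // 1  -- "1" - 2 === -1, then -1 + 2 === 1 (a number, not "1")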
I know that at face value things look messy and unpredictable, but if you really know JS inside and out, you'd guess the right answer (1). (Check operator precedence and associativity for reference.)
The problem with people coming from more classical programming languages to learn JS is that some have a condescending attitude toward the language from the get-go. They expect that, by virtue of having C/Java experience under their belt, everything should look similar in JS, and when they encounter something like "automatic type casting" they freak out and start dissing the language. But once they seriously put in the effort to understand it, their frustration and bad experience start to give way to a more positive experience and, consequently, a more positive sentiment toward the language.
So for me it's just a question of attitude, and a story of prejudice and the perceived supremacy of one's own language background at play here.
It isn't hard to see the answer, but that wasn't my point. The point is that you want predictable behavior from your functions. If you saw the function's output for integers, you could never have guessed its output for the combination of a string and an integer.
JavaScript forces you to work harder to get predictable type conversions by having a lot of random conversions. While each of those conversions might make sense on its own, the combination of them usually doesn't. Why can I use the other math operators to coerce the string into a number, but not '+'? Because '+' is a special case that tries to coerce to a string before it tries to coerce to a number; that isn't hard to grasp. But it makes the other conversions dangerous, since when you want to use '*', '-' or '/' you usually want to use '+' as well, and as it stands the others work but '+' doesn't. What would make sense is to either remove the special case for '+', or add special cases for the other operators so that they act like '+' does with strings.
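A quick console illustration of '+' being the odd one out:

    "3" - 2;  // 1
    "3" * 2;  // 6
    "3" / 2;  // 1.5
    "3" + 2;  // "32" -- only + reaches for string concatenation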
1 + "2" = "12" but "1" - 1 = 0
0 == false ( "if window.scrollTop ..." will occasionally break on user scroll )
undefined == null ( why do we need 2 empty sets? )
undefined + "dog" != null + "dog" ( breaks transitive property )
undefined !== null ( but native operators like ? and if use == and not === )
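All of the above, verbatim in a console:

    1 + "2";            // "12"
    "1" - 1;            // 0
    0 == false;         // true
    undefined == null;  // true
    undefined === null; // false
    undefined + "dog";  // "undefineddog"
    null + "dog";       // "nulldog" -- the two "empty" values diverge once coerced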
I'm really glad Javascript is taking over. Mostly because I am a small business / indie dev, and using Javascript allows me to off-load a lot of work onto distributed client computers. Which, in turn, allows me to piggyback off of Google, Github, etc., for free hosting and be massively "scalable" to traffic spikes at no cost to me. JS has become massively portable, so rapid prototyping and delivery are more possible now than ever.