Hacker News | vincetogo's comments

I suspect that the augmentation of human intelligence through tech is something we're more likely to get to before full-on AI. Assuming it's unaffordable for most of us, I'm much more concerned about a caste of super-intelligent, super-rich humans than computers that have no history of violence or hunger for power.


A more accurate title might be something along the lines of "Optimizing Bottlenecks Improves Performance".


It's not okay to be lazy. But it's wise to prioritize architecture over performance until you have numbers showing you where the necessary optimizations belong. In my experience, optimized code is almost always harder to work with, so there had better be a good reason to write it that way.

It's a lot easier to optimize well-architected code than to re-architect optimized code.


Shouldn't that be radioactivity? Light is a form of radiation, and I'm pretty sure that webcams can already measure that.


I really don't get where the hate for Objective-C is coming from. Objective-C is definitely showing its age and I'm glad to see a new language that improves on it, but I really don't understand why the author's slagging it so much. I've worked in Java and Obj-C and I'd much rather work with the latter any day. I'm not saying it's perfect by any stretch, and I'm not saying it's objectively better than Java, just that there's a case for both. But I definitely don't think iOS would have been better off if it had adopted Java as its language of choice instead of Objective-C.


C++ made certain decisions about syntax, and Java kept many of them. They have been dominant languages for many years.

Objective-C made different decisions. They aren't clearly better or worse, just different. So C++/Java developers have to learn to read a new syntax and have to give up all of their favorite tools.

After all that work, they end up using something that's basically the same as their old language. So they aren't thrilled about the change.


You're right, the syntax is the main thing I've heard complaints about. I find that really bizarre, though. I've been working in C++ longer, and more, than Objective-C, but I don't find anything offensive about the latter's syntax. So it's different. For me, the measure of a language is how hard or easy it is to get stuff done, and Objective-C holds up pretty well in that regard.


I really don't see this being the case. Any competently designed graphics engine has an abstraction layer between the app's graphics routines and the platform's graphics API. While it's not trivial, another API shouldn't be something that prevents an app from being ported to other platforms.


Did you completely miss just how much DirectX locked games into the Windows platform?


You mean the same games that were equally available on the Wii, PS3, and Xbox 360?


There's a bit more to it than that, though.

There wasn't much of a perceived interest in gaming on the Mac and Linux platforms. The Mac and Linux graphics drivers were awful. A vicious cycle, which we're (hopefully) seeing start to break now. Mac is supported by at least some major games, which is a step forward. The Steam box should speed things along nicely for Linux, hopefully.

As pjmlp points out, games were successfully ported to different consoles, it's just that Mac and Linux weren't seen as worthwhile targets.

I suspect porting a Windows game to PS3 would be much harder than porting to Mac, but developers managed.


Did Apple deprecate OpenGL on iOS? If not you should still be able to work with it if it does what you need.


I'm pretty sure they didn't. My concern though is that OpenGL will fade away.

OpenGL isn't exactly a product. There's no company that "writes" the OpenGL software. Rather, it's a specification published by a consortium. "Writing" the OpenGL libraries is a task that each GPU maker does independently. So you have a bunch of different implementations of the same API.

For a long time, GPU makers have focused their attention on DirectX and done a lackluster job with their OpenGL implementations. If APIs like DirectX and Metal continue to proliferate, there will be less and less incentive to maintain a good OpenGL implementation.


It's not too hard to write a graphics engine that abstracts out the graphics API, but it would have to be done in a language other than Swift...


Steve Jobs was a white, male billionaire. Of course he wouldn't be in jail.


This guy is putting on his white lab coat and trying to look like a scientist, but he doesn't actually cite any real research in favour of his claim; instead he uses anecdotal evidence ("Usually, once the original condition is found and treated, the ADHD symptoms go away" and such) rather than statistics to support his conclusion and sell his book.

This isn't to say he's incorrect in his conclusion, he just isn't supporting it very well. I would hope the book has referenced stats, but this article isn't very promising. He's not too far removed from a pilot claiming his story about a flying saucer must be true because he was in the air force.

For the record, I've been diagnosed with ADHD and have plenty of anecdotal evidence that it's for real, but may be suffering from confirmation bias.


Yes, I agree. In fact, I think it's particularly concerning that these op-eds are getting picked up by more recognized news outlets. He wrote an almost identical piece for the Daily Mail last week.

http://www.dailymail.co.uk/health/article-2577814/The-eminen...

This is clearly just self-promotion.

