Hacker News: jisnsm's comments

> A NULL pointer dereference occurs when software attempts to access memory at address 0 (the NULL address) via a pointer set to NULL.

There is nothing in the standard that says that NULL must be 0.


When was the last time you saw a non-zero NULL pointer?

We don't code for things like PDP-11 or Honeywell mainframes all that often.

https://c-faq.com/null/machexamp.html


Here is a non-zero null pointer: https://gcc.godbolt.org/z/Po5r5Pa36


What's going on here? I don't know enough C++ to understand.


Maybe in progressive circles. There’s life outside too.


Nah it's pretty much every circle in Europe.

Macron in France and the current German government are centre right. Poland is socially way to the right. Baltics, Scandinavia, Italy, all pretty right wing.

Only Putinistas support what the US is doing with Trump, which is to say, almost no one in Europe.


Macron is not right wing by absolutely any sane definition of the concept. Can’t say anything about the others.


Macron is definitely centre-right...

He's literally a former investment banker who's been trying to dismantle the welfare state, lower taxes, increase business friendliness, etc... He's also been tough on immigration lately. Not sure in what world any of this would be considered leftist policies...


Personal media, cough cough


There is no cough cough here. I own a not-insignificant number of DVDs and Blu-rays, which Plex helps my family watch in a relatively painless manner. But they seem to try to make it more painful.


Beware: once you get used to watching Blu-rays, streaming video looks like crap.


Yes, unions at video game studios are bound to save so many lives.

In other words your comment is hyperbolic and does not apply to the topic at hand.


Given the pattern of video game studios unceremoniously laying off people the second a game is complete and how health care in the US is overwhelmingly tied to employment, it's extremely plausible that somebody somewhere has died from lack of collective bargaining power. If it hasn't happened, that's most likely because people just leave the video game industry before they actually need those health care benefits.


>Yes, unions at video game studios are bound to save so many lives.

Stress & constant overtime crunch culture will absolutely be a detriment to the health and wellbeing of workers in any industry.

Again, labor laws saved lives in the past, they can save and/or benefit lives today.

>In other words your comment is hyperbolic and does not apply to the topic at hand.

Welcome to the topic at hand.


>Again, labor laws saved lives in the past, they can save and/or benefit lives today.

They can, but game dev is not a critical national industry that politicians are gonna fight for with laws to protect labor. Otherwise we would have had a unionized garment industry, but what we saw instead was the entire textile industry shipped overseas. Game dev will follow a similar fate.

You can unionize if you want, but unless you're guaranteed to have a blockbuster IP on your hands capable of raking in billions, you won't be able to compete with game devs from lower CoL countries.

In a globalized free market with no tariffs, high CoL labor can't compete with low CoL labor making commodity goods, which a lot of games are nowadays. Unions won't fix this, but will accelerate offshoring at the expense of the local industry.


> you won't be able to compete with game devs from lower CoL countries.

I mean, those devs might unionize as well. Certainly would've helped both the ZA/UM and CD Projekt Red devs, for different reasons.


>I mean, those devs might unionize as well.

The word "might" is doing a lot of heavy lifting here. If unionization in Asia was so easy we would have seen it happen a long time ago, but what we saw instead was suicide nets on buildings.


That caricature was so long ago a new caricature (from seven years ago) has emerged to lampoon it:

https://www.youtube.com/watch?v=dpB9Aeq8lUA

Granted, I have no doubt that the work culture is much tougher than in North America. But even the Chinese government has recognized 996 and taken steps to address it.

https://www.channelnewsasia.com/east-asia/china-neijuan-invo...

Also, Foxconn manufacturing isn't exactly the same thing as NetEase video game development, even if they both exist within the same labor environment.


>That caricature

Foxconn workers killing themselves by jumping off their office buildings is a caricature to you? How out of touch can you be?


Those tragedies largely ended after 2011. One would assume things have changed since then.

https://archive.nytimes.com/www.nytimes.com/interactive/2012...

https://www.foxbusiness.com/features/apple-foxconn-factory-c...


Really it makes no sense that Google would want to siphon money off to Israel?


If iOS only allows you to build a worse experience, then Apple customers will leave the platform for Android. The problem solves itself.


I don't think this is true. Apple could drop smart watch support entirely and I still wouldn't use an Android phone. I personally find the user experience infinitely better on iOS than on Android, and Apple would have to drop the ball very badly to get me to switch.

(Note: This isn't because Apple is without faults. iOS and macOS are both a mess right now, and iPadOS is even worse. I just think that Android is worse than that, and I know many, many Apple users are in the same boat)


Same. I actually like that Apple locks down everything to their own devices, in general, because I believe (from my limited knowledge; I am no insider) it's more secure. Perhaps not, but I trust Apple to release products that are fairly secure, and update them for several years. Whereas with Android, I'd have to trust the phone manufacturer, Google (ewww), and all of the companies that have bloatware installed by default. I do wish there were more 3rd party integrations for those who want them (without sacrificing security), but as for me I am perfectly happy giving Apple my money to get good hardware and decent software that works together well (way better integration than anything in the Android/Linux/Windows world).


Similarly if Apple opened up every API and allowed every smart watch to do whatever it wanted, I'd still prefer an Apple Watch. I tried using a Garmin and "not being able to send an sms" isn't even on the list of things I disliked about it. Ugly clunky interface, pogo pin charging, a companion app that at times wouldn't look out of place on a Windows CE smartphone circa 2006, etc.


what's so bad about android


I don't like that it's made by an advertising company, like the other commenter said. But more than that it's that it's wildly unpolished and inconsistent.

OEMs and carriers shove in their own apps (Samsung is especially bad about this: I don't want two apps for photos, and files, and messages, and calling, and browsing, etc etc). You can (sometimes) disable or uninstall them, but they can pop up again after updates, and I don't want to have to clean up my device just to use it.

And visually, apps look and feel radically different, all over the place. There are apps that still look like they're running on Jelly Bean, apps that use modern Material designs, apps that roll their own UI, and web apps in wrappers. With every new app, I have to learn how to use it. This is an occasional problem on iOS, but it's very rare compared to my experience with it on Android.


you should try an AOSP distro like GrapheneOS or CalyxOS


Operating system made by advertising company is why it's a deal breaker for me


GGP mentions "user experience" being "infinitely better". I don't think Android being made by an advertising company has much if anything to do with it.

I also don't see iOS and Android having much of a usability gap. At this point, they have very similar feature sets, and the UX is fairly well-polished, even on Android -- where yes, it took them a lot longer to get there. For the most part, if you think that either platform has bad UX, it's probably just because you've used the other one for so long, and you're used to it. (I don't think iPhone usability is bad, but on the rare occasion I do something on my wife's iPhone, I find it frustrating because it just works differently than my Android phone.)

At this point I think most (US; can't speak for other countries) iPhone users are there mainly because they've always been there, and there's fairly strong lock-in and switching costs. And iPhones are still something of a status symbol, not to mention unnecessary Apple-created problems like the "blue bubble envy" nonsense.


> iPhone users are there mainly because they've always been there, and there's fairly strong lock-in and switching costs

Or perhaps it's because we like iPhones?


Shocker, huh? After owning a few iPhones since 2007, I used and developed for Android for years, starting in 2010. I despise it. I switched BACK to iPhone and fully embraced the ecosystem years ago (macOS, iOS, iPadOS) and haven't regretted one second of it. I AM an Apple fanboy. Why? Because I love using my devices and working within this ecosystem a hundred times more than any other options available. The anti-Apple cult is obnoxious. Just don't use them if you don't like them.


Agreed. For me it's ideological, not strictly technical. They both seem pretty functional.


Don't search for "apple ad business" or you'll be very disappointed. :)


I am of course, but at least they have revenues not tied to spying. I'm not a corporate fanboy so all of this stuff disappoints me, just not going to make the perfect the enemy of the good


No, because it's much lower friction to "just" give up and buy an Apple Watch (or just do without), even if you don't like it and think that the features or design of a third-party watch are better for you. Or at least could be better, if not for Apple's anti-competitive practices.

The problem is that people don't really have choice. Both iOS and Android have positives and negatives, and often those positives and negatives are not the same. Choosing one or the other is going to have you missing some positives you want, and taking on some negatives that bug you.

If this was just the nature of how things have to be, I'd be more sympathetic. But the real reason it's this way is due to anti-competitive behavior on the part of Apple. There are no technical limitations; it's just their business model to restrict what people can do with the device they've bought. There are certainly some valid security reasons for doing this in some cases, but most of it is just to protect their revenue streams.


You couldn't pay me to go back to Android, having used Android from 2009-2020. Apple Watch is fantastic, I'm a little sad that they don't provide better integration capabilities to external devices. I can only assume that's another anti-competitive lawsuit brewing.


It doesn't if there are only two choices and consumers don't like either.


That’s not Apple’s fault or responsibility. Maybe Android should stop sucking.


Sure it is. Both Apple and Google, through various tactics, have ensured that it's virtually impossible for a third smartphone OS to be successful to anywhere near the level they have been.

Android is fine. It has some downsides vs. iOS, and some advantages. But that isn't the point. The point is that to make a new smartphone OS (or even one that's based on Android, but is independent of the Google ecosystem) that can do everything Android and iOS can do is an undertaking that few would even bother to take on. That's not due to technical challenges, it's due to market barriers that Apple and Google have erected. (IMO, the sorts of things that we as citizens in a healthy society should not allow corporations to do.)

And those that (sorta?) do try to make a competing OS, like LineageOS, GrapheneOS, CalyxOS, etc., end up with far less-capable phones than a Google-blessed Android phone. (And when most/all of those capabilities are present, it's through brittle hacks and compromises that basically turn the phone into an imitation of the Google-blessed phone, with many of the downsides intact.)

Put another way, it's not Apple's or Google's responsibility to make things more competitive, but it is their responsibility not to make things anti-competitive, and it is their fault when alternatives don't exist because of their anti-competitive behavior.


Isn't this ignoring the lock-in factor? Leaving Apple is probably more than just switching a single piece of hardware for many users. The entire Apple ecosystem encourages "buying in".

As a few examples

* (almost all) bought apps don't transfer

* bought media (music, etc) and how that integrates into the software

* icloud and other account services

* replacing your phone + laptop + watch + IoT devices, which may all be in the Apple ecosystem

So one can easily see how folks who have bought in are willing to put up with user-hostile actions.

Of course, Apple is not the only company that uses integration as a way to retain customers. However, from personal experience, I feel Android is a bit more open (at the cost of a more fractured experience). I can definitely understand the pros of not having to deal with carrier installed garbage when purchasing a device.


How long is the problem going to take to solve itself?


From my argument it follows that Apple customers don’t care about this.


No I won't, Android is not a good user experience. I'd rather buy an Apple Watch.


And this is exactly the problem. Apple presents many of their users with bad choices: either buy an Apple Watch and suffer from its downsides, or switch to Android, and suffer in other ways. Or stick with the iPhone, buy a third-party smartwatch, and suffer from an unnecessarily crippled user experience.

There's no technical reason it needs to be this way. Apple just prefers to be anti-competitive and increase their profits, than to give their users the as-close-to-ideal experience they want.


Most devices allow you to add CAs, but almost all apps nowadays use certificate pinning which means the system certificate store is ignored. I find it extremely surprising that YouTube doesn’t do that.


That sounds like you've just made it so your app doesn't work behind a corporate SSL proxy. I really need people to stop rolling their own SSL stores (looking at you, Python, Java, and Node.js). I spend way too much of my time getting things running on my work laptop that should just use the CA store IT pre-installed.


Is that a problem? What segment of Google's Apple TV revenue comes from people behind shitty middleboxes?

YouTube won't work on Chromecast if you're trying to MitM it, so clearly Google doesn't think this situation is worth making an exception for in their logic.


I can't help but wonder if any apps have tried doing TLS-in-TLS, with the outer TLS not caring about MITM, and the inner TLS doing certificate pinning?


> but almost all apps nowadays use certificate pinning which means the system certificate store is ignored

Certificate pinning (or rather, public key pinning) is technically obsolete and browsers themselves removed support for it in 2018. [1] Are there many apps still really using this?

[1]: https://en.m.wikipedia.org/wiki/HTTP_Public_Key_Pinning


HPKP, yes. Certificate pinning in apps is the norm.

The difference between HPKP and certificate pinning is that HPKP can pin certificates on the fly, whereas certificate pinning in apps is done by configuring the HTTPS client in the native application.

Apps like Facebook won't work on TLS MitM setups without using tools like Frida to kill the validation logic.


Mobile apps still frequently do, yes.

It's gotten less popular over the years as people keep asking "wait, what are we doing this for again?"; but it's still very popular in certain kinds of apps (anything banking related will almost certainly have it, along with easily broken and bypassed jailbreak detections, etc).


Most personal banking apps I've used still do this. Otherwise, the bank would be liable for your lost funds if your corporate IT department didn't secure the MITM solution properly.

(The end customer isn’t liable for the bank’s inability to properly secure their app from MITM attacks…)


I don't have any numbers, but I think this is still pretty common. On iOS for example Alamofire which is a popular network stack, still offers this as a feature. I think the use case is a bit different for apps and web sites, especially for closed ecosystems like Apple's where reverse engineering is not as easy/straightforward.

https://github.com/Alamofire/Alamofire


> I find it extremely surprising that YouTube doesn’t do that.

Not surprising for me - it used to be only banks where it was required (sometimes by law) that any and all communication be intercepted and logged, but this crap (that by definition breaks certificate pinning) is now getting rolled out to even small businesses as part of some cyber-insurance-mandated endpoint/whatever security solution.

And YouTube is obviously of the opinion that while bankers aren't enough of a target market to worry about annoying when certificate pinning breaks their background music, ordinary F500 employees are a significant enough target market.


Certificate pinning seems like extreme overkill for nearly all applications. Are most folks really doing this?


A regime can now force you to install their "root certificate" (and force organizations under their rule, e.g. national banks, to use a certificate issued by them), and these certificates would also be able to MITM your connection to e.g. Google. (1)

Looking forward to Americans being forced to install the DOGE-CA, X-CA or Truth-CA or whatever...

1) https://blog.mozilla.org/netpolicy/2020/12/18/kazakhstan-roo...


I work for a hosting company and I know what this is like. And, while I completely respect that you don't want to give out your resources for free, a properly programmed and/or cached website wouldn't be brought down by crawlers, no matter how aggressive. Crawlers are hitting our clients' sites all the same, but you only hear problems from those who have piece-of-shit websites that take seconds to generate.


git blame is always expensive to compute; and precomputing (or caching) it for every revision of every file is going to consume a lot of storage.


I guess for computationally expensive things the only real option is to put it behind a login. I’m sure this is something SourceHut doesn’t want to do but maybe it’s their only decent option.


On SourceHut, git blame is available while logged out, but the link to blame pages is only shown to logged-in users. That may be a recent change to fight scrapers.


Precomputing git blame should take the same order of magnitude of storage as the repository itself. Same number of lines for each revision, same number of lines changed in every commit.


Should be easy to write a script that takes a branch and constructs a mirror branch with git blame output. Then compare storage space used.


It is more fun to fight LLMs rather than trying to create magical unicorn caches that work for every endpoint.


“No! Don't deal with it like those dead people do. Come on!”


Because disgusting people spitting gum on the floor is obviously the corporation’s fault.


There's a popular drive through coffee and fast food franchise in my country and the immediate area around is often covered in their empty cups and food wrappers that their customers discard when they get new stuff from the drive through.

Funny thing is that the waste is often found around the store and not nearly as much in other places.

I personally blame the company that produces the garbage with their branding on it and who has a profit motive to not deal with this stuff.


No, but the fact that the gum then never degrades is.


That line of thinking doesn't get us towards no gum on the floor, and surely in our collective capacity we can do something about it?


>>> That line of thinking doesn't get us towards no gum on the floor

Sure it does.

We can collectively shame people who spit gum on the floor. We could also enact something like Singapore's approach, which carries hefty penalties.


but who profits by selling the gum?

