akamaka's comments | Hacker News

Great analysis.

Great to know that all of the existing bugs in Microsoft’s code will be faithfully translated into Rust using LLMs.


Rather than reading this opinion piece, you can learn more about the “debt crisis” by just studying this chart which shows what percentage of the federal budget goes toward paying off the debt:

https://fred.stlouisfed.org/series/FYOIGDA188S

The situation is similar to what it was in the late 1980s, and it can most likely be managed with the same level of spending restraint we saw in response to that.


Share of budget is actually a terrible way to look at this because the budget itself has exploded, and that ratio hides most of the real modern risk.

Interest costs in the 80s spiked because high rates were applied to a much smaller debt base. Today we have the opposite problem: rates that are high compared to the 2010s are now rolling onto a massively larger stock of debt. We’ve only just started to refinance that debt at the new levels, so the full impact hasn’t even shown up yet. We are still seeing significant inflation (meaning rates still have room to rise), early signs of an economic pullback, and early signs of a Fed unwilling to raise rates sufficiently because of the impact on the fiscal environment.
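
To make the stock-versus-rate point concrete, here’s a back-of-the-envelope comparison in Python (illustrative round numbers, not actual Treasury figures):

    # Annual interest cost = average rate x debt stock.
    # Illustrative numbers only.
    interest_80s = 0.10 * 2.6    # ~10% on ~$2.6T of debt -> ~$0.26T
    interest_now = 0.045 * 36.0  # ~4.5% on ~$36T of debt -> ~$1.62T

    print(f"1980s-style: ${interest_80s:.2f}T/yr")
    print(f"today-style: ${interest_now:.2f}T/yr")
    # A similar-looking ratio chart can hide that the absolute cost, and its
    # sensitivity to each further percentage point, is several times larger now.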


If you compare government budget as a share of GDP, you can see that it hasn’t “exploded” outside of crisis periods. The current spending rate is elevated about 25% over the 1990s period of restraint, but quite close to the 1980s.

https://fred.stlouisfed.org/series/FYONGDA188S


You keep switching between flows and stocks and changing what you want your numerator and denominator to be. Why wouldn't I just look at real spending and real debt numbers, i.e. the numbers we ultimately have to pay interest on?

GDP % is only relevant if we are politically able to raise taxes.


Both of the charts I posted have GDP as the denominator (although I incorrectly said the first was “share of budget”).

I think it’s very important to use GDP as a denominator, because otherwise you’ll be stuck crying wolf, saying “debt always keeps going up” even during the good times.

There are a lot of people who simply don’t believe that the government budget needs a trim right now, because people have been continuously saying there was a debt crisis even when the financial situation was relatively favorable.


You know why Japan is fine with 236% debt-to-GDP?

Because measuring things against GDP like this is completely meaningless.

If you use your brain for even the slightest moment, it would be completely obvious that the sum total amount of the debt is a huge deal because of the scale of the interest.


IANA economist, but if there were a debt crisis, it would ultimately be about the psychology of the investors who would buy government debt. They want to be very, very confident that they will be paid back (which is why they're willing to accept a low interest rate).

If those investors are satisfied with a return to a late-80s fiscal posture, then great. But if they're worried that spending would just creep up again once the pressure is off, they might "demand" further cuts.


In particular, investors often like to see infrastructure development (investing in future GDP), as opposed to paying day-to-day operating costs, retirements, interest on debt (never mind larger debt as far as the eye can see), and other creative ways to prevent future GDP. And there is very, very little infrastructure development in US budgets.


At current trajectory, interest costs will exceed 50% of all tax revenue within 30 years. See footnote 5:

https://media4.manhattan-institute.org/wp-content/uploads/a-...

The author took the CBO's budget projections and adjusted them for "false sunsets", i.e. the tax cuts that were supposed to expire before they were extended, and the fake spending cuts written into the law that will never happen, i.e. the FRA.
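
To see how a trajectory like that plays out, here’s a toy compounding sketch in Python; the starting share and growth rates are assumptions for illustration, not the CBO’s or the author’s figures:

    # Interest costs growing faster than tax revenue eventually dominate it.
    interest_share = 0.18   # interest as a share of tax revenue today (assumed)
    interest_growth = 1.07  # interest costs grow 7%/yr (assumed)
    revenue_growth = 1.04   # revenue grows 4%/yr (assumed)

    years = 0
    while interest_share < 0.50:
        interest_share *= interest_growth / revenue_growth
        years += 1
    print(f"crosses 50% of revenue after ~{years} years")  # ~36 with these inputs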


If we're going off of a "share of budget" metric then the fastest way to reduce debt share is to increase spending in other areas!


I was mistaken to say “share of budget”, because the chart I linked to is actually share of GDP, which hopefully isn’t affected by the problem you pointed out.


> can most likely be managed with the same level of spending restraint we saw in response to that.

so... austerity? Like the article suggests?


Was the 1990s austerity “severe”? I remember a lot of complaining at the time, but it doesn’t seem too bad in hindsight.


Given the security situation today, repeating the peace dividend is not an option, so it would be much tougher than the early 1990s.


On the contrary, that’s exactly in line with how much the average American spends on holidays:

https://www.pacaso.com/blog/average-vacation-cost


> With M5, Apple Vision Pro renders 10 percent more pixels with the micro-OLED displays

I found this little piece of information interesting. Apparently the display on the Vision Pro has such high resolution that they reduce the detail of the rendering. I don’t think I’ve ever seen that reported before. It means that an even higher quality display is still far in the future, since the silicon to push that many pixels isn’t quite ready.


I think it also does foveated rendering: it detects where you are gazing and renders that area at a higher resolution than the periphery.
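
A minimal sketch of the idea in Python (the eccentricity thresholds and scales are made up for illustration, not Apple’s actual pipeline):

    import math

    def render_scale(px, py, gaze_x, gaze_y, ppd=34):
        """Pick a resolution scale for a screen tile based on its angular
        distance (in degrees) from the current gaze point."""
        dist_px = math.hypot(px - gaze_x, py - gaze_y)
        eccentricity_deg = dist_px / ppd   # pixels -> degrees at ~34 PPD
        if eccentricity_deg < 5:           # fovea: full resolution
            return 1.0
        elif eccentricity_deg < 20:        # near periphery: half resolution
            return 0.5
        else:                              # far periphery: quarter resolution
            return 0.25

    print(render_scale(2000, 1500, gaze_x=1950, gaze_y=1520))  # near gaze -> 1.0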


At the same time, the resolution still isn’t quite enough for a virtual Mac screen to not look worse than the real one. The Vision Pro has around 34 PPD. “Retina” resolution for VR is generally considered to require at least 60 PPD. Since pixel count scales with the square of PPD at a fixed field of view, that would mean roughly three times the pixels at the Vision Pro’s panel size. Add to that that the Vision Pro’s FOV is relatively small, and you’d want even more pixels than that.
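
The scaling arithmetic, as a quick Python sketch (the 34 and 60 PPD figures are the ones above; only their ratio matters):

    # Pixel count scales with PPD squared at a fixed field of view.
    current_ppd = 34  # approximate Vision Pro pixels per degree
    target_ppd = 60   # commonly cited "retina" threshold for VR

    scale = (target_ppd / current_ppd) ** 2
    print(f"~{scale:.1f}x the pixels at the same FOV")  # ~3.1x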


It won't look as sharp as a real display at the same size, but it is significantly larger. I've spent probably 3,000+ hours in the headset, with the vast majority of that using the Virtual Display. I've always preferred larger fonts and sitting close to the display, so using the wide virtual display mode has been a fantastic experience for me. My eye-strain-induced migraines have basically disappeared since I started using this as my main display, moving from my previous setup of 30% Studio Display and 70% 16" MBP.


That just means that China started later. Europe is already past 50% and is on the top half of the S-curve, where additional renewables have diminishing returns.


Look at the absolute values: China added 4x the clean energy the EU did. Once the manufacturing of panels is in place, they can keep doing it without further investment. That's not diminishing returns, that's actual power every time. Cars don't run on percentages, they run on kWh. There's nothing diminishing.


The diminishing return happens when you have so many solar panels that on a sunny day you generate more than 100% of the electricity you can use. Maybe that situation is great if you want to subsidize solar panel factories, but you get less usable kWh for the same cost.

It’s completely expected for Europe’s installation of solar panels to begin tapering off as they get more return on investment by installing battery storage and decarbonizing other parts of the economy.
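
A toy model of that saturation effect in Python (made-up numbers, with usable output capped at what the grid can absorb):

    # Usable solar energy in a sunny hour, with demand as a hard cap.
    DEMAND_MWH = 100  # max the grid can absorb that hour (illustrative)

    def usable(capacity_mwh):
        return min(capacity_mwh, DEMAND_MWH)

    for capacity in (50, 100, 150, 200):
        marginal = usable(capacity) - usable(capacity - 50)
        print(f"capacity {capacity}: last 50 built added {marginal} usable MWh")
    # Every unit built past the demand cap adds 0 usable MWh on sunny days:
    # the diminishing-returns picture described above.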


Then you store that energy or find a way to use it. Melt ore when it’s abundant, make metal when it’s abundant, dig holes when it’s abundant, use the metal to turn the holes into a reservoir when it’s abundant, and eventually use the reservoir to pump water in and out as a way to store the abundant energy for when it’s not.



All of these things are an order of magnitude more difficult and annoying than simply storing flammable gas or liquid in a tank and using it whenever you need it.

I'm not saying we should continue using fossil fuels forever, but being unrealistic about how hard the transition to intermittent renewables will be isn't sensible.


Having more generation capacity also makes renewables less intermittent, though, because with enough solar capacity, even on a cloudy day they may produce enough energy to cover demand.

It doesn't solve the problem completely, but it surely helps.


> All of these things are an order of magnitude more difficult and annoying than simply storing flammable gas or liquid in a tank and using it whenever you need it.

There’s quite a bit of complexity leading to the “simply storing in a tank” step.


On the other hand, the additional solar capacity during overcast days might still be worth the additional investment.

Electricity might become free on sunny days, but you'll still have to pay serious money for it during cloudy, windless days. Even a solar panel operating at 10% capacity becomes worth the effort.
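
Rough arithmetic behind that overbuild argument in Python (the 10% figure is from the comment above; the demand number is illustrative):

    # Covering demand from solar alone on an overcast day.
    demand_mw = 100
    overcast_fraction = 0.10  # panels produce ~10% of nameplate when overcast

    nameplate_needed = demand_mw / overcast_fraction
    print(f"{nameplate_needed:.0f} MW of panels to cover {demand_mw} MW overcast")
    # The 10x-overbuilt fleet is wildly surplus on sunny days, which is where
    # curtailment, storage, and flexible demand re-enter the picture.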


It’s just a mindless comment by someone who doesn’t keep up with the latest developments. Perovskites already entered small-scale commercial production last year and are being deployed in the field to validate how well they hold up in real-world conditions, so it seems we’re only one step away from large-scale deployment.


In the TV show Silicon Valley, there’s a joke that Nelson “Big Head” Bighetti, the perennial underperformer, did his undergrad at ASU. But I guess that’s one thing the show got wrong, because Arizona State University finished among the top three American schools in the world finals.


This seems like a clickbait title, because I’ve never heard of a hardware upgrade being called a “patch”.


"The term "patch" came from early use in telephony and radio studios, where extra equipment kept on standby could be temporarily substituted for failed devices." - from https://en.m.wikipedia.org/wiki/Patch_cable

But yeah, the term patch just seems weird in this article. Why not just "upgrade" or "fix"?


I'm not so sure; I thought "patch" originated from punching holes in cards to program things. A software patch was literally a patch of tape that hid an erroneously punched hole in such a card.

The term patch-cable seems to be way younger.


https://www.merriam-webster.com/dictionary/patchboard

> patchboard: a switchboard in which circuits are interconnected by patch cords

> First Known Use: 1934, in the meaning defined above


I find it far more likely that "patch", in the sense of fixing problems with small, targeted changes, derives from the use of the term for fixing holes in cloth by sewing on another piece of cloth.


Everything is about patching up clothes or other things. I was just commenting on the “patch-cable seems to be way younger” remark.


Hence patch cable.


"Service"?


I don't think the patch is hardware. The hardware they're talking about is the "Gameboy like device" that runs the exploit.


> The Verge now reports that Hyundai is offering a security patch for this issue through software and hardware upgrades to Ioniq 5 customers.

You do a hardware upgrade on the car to patch the vulnerability.


The etymology of patch harkens back to Larry Wall's UNIX patch tool for applying diffs to a source code base.


The etymology of patch predates software by hundreds of years.


https://www.etymonline.com/word/patch

> "piece of cloth used to mend another material," late 14th century.

> Electronics sense of "to connect temporarily" is attested from 1923 on the notion of tying together various pieces of apparatus to form a circuit.


tldr: OCaml


The article isn’t really very persuasive about this, though. Having worked with OCaml at Jane Street is not, I think most of us would agree, going to be a serious barrier to getting hired to work with another language somewhere else.

> For Jane Street’s technical rank-and-file, particularly the many hired straight out of university, non-compete agreements may be surplus to requirements. A scan of jobs listed by Millennium, a rival fund that has recently clashed with Jane Street in court, shows the strength of the latter’s position in the job market. Millennium wants engineers experienced in c++, Go, Java and Python, languages that are commonly used across finance and tech. OCaml developers, it seems, are Jane Street’s to keep.

If someone worked with OCaml at Jane Street I would just take this as a signal that they are smart enough to quickly learn Go, Python, whatever they need, and will probably be more successful after 6 months than a “Python developer” would be.


I've experienced this while leaving a different company that used a rare language.

It's a tough situation being experienced in <peculiar language for Company A> when you need to ace technical interviews in <mainstream language for Company B>.

Once you have a few years of promotions, it gets even tougher when you need to compete with <mainstream language> senior+ software engineer candidates at the destination company. Maybe <flashy brand name> was enough to land the interview, but experience mismatches and limitations can remain apparent in the interview itself.


> Having worked with OCaml at Jane Street is not, I think most of us would agree, going to be a serious barrier to getting hired to work with another language somewhere else.

The retention factor is *not* that other companies wouldn't want to hire them, but rather that these employees are likely to dislike being forced to use something other than OCaml.


Sure you would, but would Millennium or other high-caliber firms? It seems they want engineers with C++ experience and that's not exactly 'easy' to pick up 'quickly'.


That programming language your doctor doesn't want you to know about.

