Hacker News | zero_bias's comments

I’m the author of a relatively popular open source project (4.8k stars, 100k+ downloads/month) and have lived on donations for five years. I use, and am eternally grateful for, the following OSS plans:

- Unlimited BrowserStack. This would otherwise cost thousands of dollars

- Free Netlify hosting. Server-side analytics is still $9/month, but anyway

These plans have one thing in common: they are not limited in time. Open source cannot be built on an unstable foundation.

The six-month Anthropic offer is just ridiculous. It’s a bland PR move; I can’t express how miserable this plan is. It’s just not for us.


It's not just a PR move. They want you to train their model, and their estimate is that six months of data is enough.

The ISS and Mir combined are not a "large market". How many radiators do they require? A single space data center will probably demand whole orders of magnitude more cooling.


The ISS cost $150B, and a large factor driving that cost was payload weight.

Minimizing payload at any point was easily worth a billion dollars. And given how heavy and necessary the radiators are (look them up), you can bet a decent bit of research was invested in making them lightweight.

Heck, one bit of research that lasted the entire lifetime of the shuttle was improving the radiative heat system [1]. Multiple contractors and agencies invested a huge amount of money to make that system better.

Removing heat is one of the most researched problems of all space programs. They all have to do it, and every gram of reduction means big savings. Simply saying "well a DC will need more of it, therefore there must be low hanging fruit" is naive.

[1] https://llis.nasa.gov/lesson/6116
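For a rough sense of scale, here's a back-of-the-envelope Stefan–Boltzmann estimate of radiator area. All numbers (1 MW of waste heat, emissivity 0.9, a 300 K radiator) are illustrative assumptions, and absorbed sunlight and view factors are ignored:

```rust
// Sketch: how much radiator area does it take to reject a given
// amount of waste heat purely by radiation? P = eps * sigma * A * T^4
fn radiator_area_m2(heat_w: f64, emissivity: f64, temp_k: f64) -> f64 {
    const SIGMA: f64 = 5.670374419e-8; // Stefan-Boltzmann constant, W/(m^2 K^4)
    heat_w / (emissivity * SIGMA * temp_k.powi(4))
}

fn main() {
    // Assumed numbers for illustration only
    let area = radiator_area_m2(1.0e6, 0.9, 300.0);
    println!("~{:.0} m^2 of radiator for 1 MW", area); // on the order of 2400 m^2
}
```

Even this crude estimate shows why every gram of radiator mass matters: a data-center-scale heat load needs radiator area measured in thousands of square meters.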


At least 50 Cent didn’t make TikTok-style music with a single polished part and the rest of the song junk. Nowadays his goals look almost innocent.


In Rust you can use multiple allocators at the same time. Allocation failure is handled by the allocator, which can convert a panic into some useful behavior. This logic is observable in WASM, where OOMs happen all the time and are handled transparently to the application code.

So I assume there are no real blockers, contrary to what people in this thread assume; this is just not conventional behavior, it's ad hoc, so we need to wait until well-defined, stable OOM handlers appear.
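A minimal sketch of what's already possible on stable Rust today: a custom global allocator that observes failures, plus fallible allocation via `try_reserve` instead of a panic. The wrapper type and its behavior are illustrative assumptions, not a production pattern:

```rust
use std::alloc::{GlobalAlloc, Layout, System};

// Wrap the system allocator so allocation failures can be observed
// (log, free caches, etc.) instead of silently aborting later.
struct FallibleAlloc;

unsafe impl GlobalAlloc for FallibleAlloc {
    unsafe fn alloc(&self, layout: Layout) -> *mut u8 {
        let ptr = System.alloc(layout);
        if ptr.is_null() {
            eprintln!("allocation of {} bytes failed", layout.size());
        }
        ptr
    }
    unsafe fn dealloc(&self, ptr: *mut u8, layout: Layout) {
        System.dealloc(ptr, layout)
    }
}

#[global_allocator]
static ALLOC: FallibleAlloc = FallibleAlloc;

fn main() {
    // Fallible allocation on stable Rust: try_reserve returns a Result
    // rather than panicking on OOM.
    let mut v: Vec<u8> = Vec::new();
    match v.try_reserve(1024) {
        Ok(()) => println!("reserved"),
        Err(e) => println!("allocation failed: {e}"),
    }
}
```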


It’s important for electronic music to have a consistent and predictable pitch; otherwise DJs on stage have a hard time playing (they loop the start of a song and play it together with a tail loop of the previous song). So Daft Punk needed to intentionally choose a fractional BPM, since mastering engineers will not change the pitch even slightly.


DJs routinely change the playback speed to match tempos.

It’s easier if a track’s tempo stays the same throughout its duration, but even if it changes, DJs will adjust the playback speed on the fly.

As far as syncing is concerned, the actual value of the tempos doesn't matter at all.
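To make that concrete: the speed adjustment needed to beatmatch depends only on the ratio of the two tempos, not their absolute values. A sketch with assumed BPM figures:

```rust
// Sketch: percent playback-speed change needed to match one tempo to another.
// The adjustment depends only on the ratio target/source.
fn pitch_adjust_percent(source_bpm: f64, target_bpm: f64) -> f64 {
    (target_bpm / source_bpm - 1.0) * 100.0
}

fn main() {
    // Assumed tempos for illustration: a 122.7 BPM track matched to 124 BPM
    let adj = pitch_adjust_percent(122.7, 124.0);
    println!("{:+.2}% playback speed", adj); // prints "+1.06% playback speed"
}
```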


Usually top electronic studios use an external clock-sync device, which prevents that kind of issue; I’m sure Daft Punk used one too.


Probably not on this album.


More than likely they did; sync boxes have been around for a long time, they're not that expensive (they would have cost hundreds of dollars or euros at the time), and Daft Punk could surely have bought or borrowed one if they wanted. I was just having a chuckle at the blog author's idealism about how well sync works in the real world. If they were using MIDI, the standard allows for a 1% timing variance at the hardware level (not 1% of one beat; 1% of the tempo). I would guess Daft Punk were more likely using old 'classic' synths with control voltage, which is often a bit more reliable.


Why not? It’s common equipment, and it doesn’t count as a "digital device forbidden in an analog studio", since you connect the synth directly to it just to make sure your wave fronts are in sync.


First, it's not that common. These are specialized tools you won't find in every producer's tool belt. Not everyone cares that much about MIDI clock accuracy, and many people will just circumvent the problem. Also, like Homework, Discovery is very much a home-studio album, recorded at Bangalter's place. That's not what "top electronic studios" are. Ultimately we don't know, and this whole thread is like seeing pictures in clouds to me.


Actually, it is very common: the DAW is usually used as the master clock for external devices such as drum machines, sequencers, and synths with onboard arpeggiators/sequencers, and the DAW is itself commonly synced to the high-end clocks in a decent-quality audio interface, which do not have as much jitter these days as most folks seem to think. Even in the '90s, this was a feature of many audio interface and DAW rigs. Bangalter was not known for having cheap gear.


> audio interface, which do not have as much jitter these days

But it's not about audio jitter.

Anyway, like I said, too much speculation in this thread.


It’s about MIDI jitter, and if the MIDI clock is locked to the audio clock, it’s going to be pretty darn good.


That's a big "if". And then there is the jitter induced by the gear receiving the clock. Many drum machines are bad there. See https://innerclocksystems.com/litmus/


It’s really not a big "if"; it’s the actual reality of modern audio interface/DAW configurations. Very few modern audio interfaces do not have their MIDI clock very tightly bound to their audio clock, which is the master clock with far greater accuracy anyway.

Yes, MIDI jitter can be compounded on the receiving end - but having a very tightly bound MIDI clock to the audio clock can negate a lot of those issues upstream in the first place, and that is precisely why you get a good audio interface that does this anyway.

(Disclaimer: have worked in pro audio product development for decades, have written drivers for exactly this use-case, and I have personally been in the trenches to fight the myths about Audio and MIDI jitter as a developer for a long time now..)
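For reference, MIDI clock runs at 24 pulses per quarter note, so the tick interval is a simple function of tempo. A sketch with an assumed 120 BPM tempo, showing how much a 1% tempo variance moves each tick:

```rust
// Sketch: MIDI clock tick interval at 24 PPQN, and the shift caused
// by a 1% tempo variance. Tempo value is an assumption for illustration.
fn tick_interval_ms(bpm: f64) -> f64 {
    60_000.0 / (bpm * 24.0) // 24 MIDI clock ticks per quarter note
}

fn main() {
    let bpm = 120.0;
    println!("nominal tick:  {:.3} ms", tick_interval_ms(bpm)); // ~20.833 ms
    println!("at +1% tempo:  {:.3} ms", tick_interval_ms(bpm * 1.01)); // ~20.627 ms
}
```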


It’s called WML/WAP


I think we can do better than a 15x15 text window


WML/WAP got a bad rap I think, largely because of the way it was developed and imposed/introduced.

But it was not insane, and it represented a clarity of thought that then went missing for decades. Several things that were in WML are quite reminiscent of interactions designed in web components today.


Gopher today (and even more so Gemini) can do almost anything WAP did, but without being a dead platform.


Have you read the WML 1.x spec? Let alone WML 2.x which never really happened. It had much more interesting scope than Gemini does.

Gemini is not a good or sensible design. It's reactionary more than it is informed.


Instead of offloading batch computations to a proprietary cloud, it’s better to actually optimize the incredibly slow and unstable computational kernel.

In any case, that’s not the happy path; Mathematica gets stuck in symbolic computations for ages. My FFT-based research in Mathematica slowed to a crawl, with tens of minutes of waiting, even with 90% of the code compiled to binary. MATLAB finishes the same task in milliseconds.


The universe could be a probability-based Game of Life simulation; a basic Turing machine cannot handle that.
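A sketch of what a probability-based step could look like: the classic Game of Life rules apply, but a birth or survival only fires with probability p. The grid, seed, probability, and the tiny xorshift RNG are all illustrative assumptions:

```rust
// Minimal xorshift64 RNG so the sketch has no external dependencies.
struct XorShift(u64);

impl XorShift {
    fn next_f64(&mut self) -> f64 {
        self.0 ^= self.0 << 13;
        self.0 ^= self.0 >> 7;
        self.0 ^= self.0 << 17;
        (self.0 >> 11) as f64 / (1u64 << 53) as f64 // uniform in [0, 1)
    }
}

// One step of a probabilistic Game of Life on a bounded grid.
fn step(grid: &[Vec<bool>], p: f64, rng: &mut XorShift) -> Vec<Vec<bool>> {
    let (h, w) = (grid.len(), grid[0].len());
    let mut next = vec![vec![false; w]; h];
    for y in 0..h {
        for x in 0..w {
            // Count live neighbors (edges treated as dead).
            let mut n = 0;
            for dy in [-1i64, 0, 1] {
                for dx in [-1i64, 0, 1] {
                    if dy == 0 && dx == 0 { continue; }
                    let (ny, nx) = (y as i64 + dy, x as i64 + dx);
                    if ny >= 0 && ny < h as i64 && nx >= 0 && nx < w as i64
                        && grid[ny as usize][nx as usize] { n += 1; }
                }
            }
            let alive = grid[y][x];
            let classic = if alive { n == 2 || n == 3 } else { n == 3 };
            // The probabilistic twist: the classic rule only fires with probability p.
            next[y][x] = classic && rng.next_f64() < p;
        }
    }
    next
}

fn main() {
    // A blinker, stepped with 90% rule reliability.
    let grid = vec![
        vec![false, true, false],
        vec![false, true, false],
        vec![false, true, false],
    ];
    let mut rng = XorShift(42);
    println!("{:?}", step(&grid, 0.9, &mut rng));
}
```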


Is this your pull request?


No, no... I know better than to put too much work into something before poking the core devs and seeing if it's something they'd be interested in.

If they don't want code written by a robot, then what do I care? Mostly I wanted to see how well the daffy robots could work in an established code base, and I chose one I was familiar with to experiment on. They were less than receptive, so, their loss, I suppose...

