
> from 1930

Good news: It's very likely that the copyright has expired. If you were to scan them, remember to upload them to archive.org for everyone else to see.

Bad news: That's only the case if the copyright wasn't renewed by the owner. Most owners didn't renew, but to determine whether that holds for a given work, you need to go through huge catalogs of registered entries from the U.S. Copyright Office.


Incorrect. It is almost certainly still in copyright in the United States. Anything published after 1925 will be in copyright, except works published without notice (1926-1977) or not renewed (published 1926-1963). The exceptions almost certainly don't apply to NatGeo.

Scanning typically falls under fair use, so copyright only applies to distribution of the scans.

edit: Maybe you edited or maybe I'm just dumb. Anyway, the problem with relying on a lack of renewal is that you have to prove a negative. The NYPL, among others, has been doing interesting work on this problem: https://www.nypl.org/blog/2019/09/01/historical-copyright-re...


Under US law, for works published between 1925 and 1964, it was necessary to renew copyright after 28 years.[1] There are many works which have fallen out of copyright for nonrenewal, and there are projects now going through copyright renewal records to determine which of these are in fact public domain in the US. Roughly 3/4 of potentially copyrighted works (published since 1924 and initially registered) have proved to be public domain.

Once copyright has lapsed, it cannot be reinstated.

https://www.crummy.com/2019/07/22/0

https://www.nypl.org/blog/2019/05/31/us-copyright-history-19...

________________________________

Notes:

1. Technically, the renewal obligation existed until the 1976 act, at which point copyright status became automatic; the 1992 act then retroactively waived renewal, back to 1964, for works that had never lapsed. It's complicated. (NYPL link above.)


I bet many people will say that today's circuit boards are so cheap that making one at home is just too troublesome - and that these days you can't make a good digital system on a two-layer circuit board without a ground plane and controlled impedance, so why bother? And the usual reply is that you can get a board immediately without waiting, and for low-speed analog circuits or small microcontrollers, two-layer boards are usually adequate.

I believe both are true, but there's also an important application that is not always mentioned - RF prototypes. It's still very expensive to have a circuit board fabricated on specialized low-loss RF laminate, such as the Rogers series, but the raw laminate can be purchased at a reasonable price. Having a CNC/milling machine can be extremely useful for prototyping RF planar circuits in the GHz realm.


> I bet many people will say that today's circuit boards are so cheap and making one at home is just too troublesome

I've heard this a lot and can never understand it. It's like saying "why use a 3D printer when you can get the same thing for $50, or for $10 and a month of waiting?" Because sometimes I want a good-enough thing for cheap now, not a perfect thing for expensive later.


The reason is that ordering PCBs takes ~2 weeks and costs $6, so any time invested in making them yourself is more expensive. Additionally, a 3D printer is much less of a hassle to use (basically level the bed, start the print, and wait) than etching PCBs.

The only reasons for doing them yourself (if you already have all the required things) are:

- Getting them really quick (~hours)

- Getting them somewhat quick (saving around $30 on shipping)

- Wanting to

With ~everyone using SMD components today, solder mask is also pretty much a must-have, which makes doing it yourself even more annoying.

Compared to 15 years ago, when doing it yourself was the only option, it's really understandable that a lot of people just don't bother anymore.


Yes, exactly. I always want them really quick, because I'm in the middle of a hacky build I'll only ever make one of. Same for ~all my maker friends: it sucks to leave a build right at the height of motivation to wait for PCBs, so most of them use perfboards or similar ugly hacks. CNCed PCBs would be a great improvement.


I had a buddy try to etch a PC board once. He used a disposable aluminum tray to do the etching in. Can you guess how it went?


I'd argue completely differently: PCB assembly shops are usually unwilling to work with externally-produced PCBs, so there's no point in making the PCB yourself.

Unless, of course, you are really good at soldering. But for me, 0.5mm pitch ICs or 0.4mm pitch connectors are way out of my motor skill league.

Of course, at that level, it's mostly down to using a pick&place robot, which could easily be shared by multiple people. So I imagine that in a few years, there'll be P&P centers where you can rent such a robot by the hour, similar to how there are 3D printing communities nowadays.


> PCB assembly shops are usually unwilling to work with externally-produced PCBs, so there's no point in making the PCB yourself.

I think these home PCB processes and machines are really designed for prototypes and experiments - that's the aspect they're great at. If you find that you need to send them to an external PCB assembly shop, you probably shouldn't make your own board to begin with. Same for externally-produced professional PCBs - I order a raw board rather than a fully-assembled prototype because my chip is an uncommon part and they don't offer it for prototype assembly.

> Unless, of course, you are really good at soldering. But for me, 0.5mm pitch ICs or 0.4mm pitch connectors are way out of my motor skill league.

Speaking of soldering, I have no problem with 0.5mm pitch LQFP ICs under a good stereo microscope - I can just use brute force. However, my own problem is 0.5mm QFN - I have to use stencil printing and reflow soldering since I'm not good enough to hand-solder that. I find a high-quality board with accurate solder mask between the pins (no bridging) and an ENIG surface finish (maximum flatness) extremely helpful. I don't think a simple DIY PCB can handle these applications (but I'd be glad to find otherwise). But again, this was a 1 Gbps+ board.

Conclusion: I still believe DIY PCBs have their place for prototypes and experiments, but if one argues they're not useful because they cannot be assembled by a PCB shop or cannot reach 1 Gbps, that would be demanding too much and isn't really fair to these simple boards.


Not everywhere can you get a board done in 8 hours...

There are places in the world where even DHL Sameday takes a month to arrive.


It seems that even a megacorp like McDonald's needs the right to repair...

On second thought, not exactly the megacorp, but the franchise owners.


McDonald's the corp isn't hurt by this; individual franchise owners bear the cost. In particular, McDonald's the corp mandates the particular ice cream machine that is used, rather than allowing the franchise owner to pick from several machines. The video argues that this is done because McDonald's is interested in benefiting the company that makes and services the ice cream machines.


Given the meme is already circulating among the youth, with McDonald's ice cream as the butt of the joke, and given it has reached the top of HN many times already, I say this hurts McDonald's image a lot.

When people think about “broken like a McDonald's ice cream machine”, I think McD can legitimately ask Taylor for damages at the corporate level.


On the face of it, McDonald's corp should also be harmed by this. It makes their customers less successful through higher costs & lower sales, and it tarnishes their brand when a customer of their customers is unable to purchase an ice cream.

The two questions in my mind after watching:

1. What makes McDonald's corp okay with it?

2. What is the difference between the arrangement with McDonald's and the other franchises/chains? (Wendy's, In'n'Out, etc.)


McDonald's Corp owns a significant number of restaurants directly. Most are franchises, but not all. Thus the company is directly hurt.

This feels more like someone not in the know about the real problems seizing on a complaint that seems worse than it is.


I’ve seen franchisors collect kickbacks from vendors for being an “approved vendor” the franchisee has to use.


Pretty sure McDonald's is fully aware of this and deserves a large portion of the blame.


I've always believed that allowing any form of JIT (in userspace programs) by default creates unnecessary security risk - only a handful of programs use it. The sysadmin should have the choice to enforce strict W^X on all programs and to whitelist JIT-enabled applications only when needed.

The kernel-space BPF JIT in particular is a huge attack vector and has repeatedly been exploited (or facilitated other exploits) in the past. If you don't need the BPF JIT, I recommend disabling it. Sure, tcpdump or nftables may be slower, but often it doesn't matter unless it's a cutting-edge production system that actually relies on the BPF JIT. It can be removed completely when building the kernel, disabled globally by setting "net.core.bpf_jit_enable" to 0 via sysctl, or blocked for unprivileged programs by setting "kernel.unprivileged_bpf_disabled" to 1.
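For reference, a minimal sysctl sketch of the two knobs mentioned above (the sysctl names are real; the file path is just illustrative, and on kernels built with CONFIG_BPF_JIT_ALWAYS_ON the JIT cannot be turned off):

  # e.g. /etc/sysctl.d/99-bpf.conf
  # Disable the BPF JIT compiler globally (the interpreter is still used).
  net.core.bpf_jit_enable = 0
  # Block the bpf() syscall for unprivileged users entirely.
  kernel.unprivileged_bpf_disabled = 1

Apply with "sysctl --system" or at the next boot.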


I’ve gone with the KSPP suggestion:

  # Turn off unprivileged eBPF access.
  kernel.unprivileged_bpf_disabled = 1

  # Turn on BPF JIT hardening, if the JIT is enabled.
  net.core.bpf_jit_harden = 2

https://kernsec.org/wiki/index.php/Kernel_Self_Protection_Pr...


Not all JITs are WX. TurboFan in V8 is W^X.


PaX's implementation of W^X (MPROTECT) is stricter - you can disallow any attempt to introduce new executable code, including writing to memory first and giving it executable status later.
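To make the distinction concrete, here is a minimal sketch (Linux + CPython ctypes, purely illustrative) of the "write first, make executable later" pattern that JITs rely on and that strict W^X / PaX MPROTECT is designed to block - it's the final mprotect(PROT_EXEC) call that gets denied under such a policy:

  import ctypes, mmap, os

  libc = ctypes.CDLL(None, use_errno=True)
  libc.mmap.restype = ctypes.c_void_p
  libc.mmap.argtypes = [ctypes.c_void_p, ctypes.c_size_t, ctypes.c_int,
                        ctypes.c_int, ctypes.c_int, ctypes.c_long]
  libc.mprotect.argtypes = [ctypes.c_void_p, ctypes.c_size_t, ctypes.c_int]

  # 1. Allocate an anonymous read+write (non-executable) page.
  page = libc.mmap(None, mmap.PAGESIZE,
                   mmap.PROT_READ | mmap.PROT_WRITE,
                   mmap.MAP_PRIVATE | mmap.MAP_ANONYMOUS, -1, 0)

  # 2. A JIT would emit machine code into the page here.

  # 3. Ask the kernel to flip the same page to executable.
  ret = libc.mprotect(ctypes.c_void_p(page), mmap.PAGESIZE,
                      mmap.PROT_READ | mmap.PROT_EXEC)
  if ret != 0:
      print("denied:", os.strerror(ctypes.get_errno()))  # strict W^X in effect
  else:
      print("page is now executable (no strict W^X policy)")

On a default desktop kernel the second call succeeds; under PaX MPROTECT (or a comparable SELinux execmem policy) it fails, which is exactly the restriction being described.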


Fun fact: Shugart is today's Seagate. The name was changed because the founder, Alan Shugart, had two companies, one for HDDs (Shugart Technology) and another for FDDs (Shugart Associates), both named after him. The FDD company was sold to Xerox but kept the name, so the HDD company had to change its name to Seagate. The rename was quite clever: it still rhymes with the original name.


That pqRSA paper by DJB is a joke, but it's mathematically correct and an interesting thought experiment - RSA really does become post-quantum when you use a 1 TB key, because at that point it outgrows what Shor's algorithm can feasibly scale to. The paper is actually better than that: it proposed an original algorithm, GEECM, that's faster than Shor's algorithm for numbers with many small factors, and then also showed that pqRSA is safe from GEECM. He even submitted the scheme to the NIST Post-Quantum Competition for review (alongside his more practical algorithms like Classic McEliece); it's just hilarious.

> DJB yelling from the back of the room "How much RAM does the NIST benchmarking machine have??" Dustin Moody replying "Dan, we're not benchmarking pqRSA!"

https://crypto.stackexchange.com/questions/59591/why-is-pqrs...

Here's his explanation of the idea:

https://cr.yp.to/talks/2017.06.27/slides-djb-20170627-pqrsa-...


Given the number of Star Wars references in the various NIST Post-Quantum Competition scheme names (CRYSTALS-KYBER, SABER, NewHope) I'm rather sad he didn't call it "Post Quantum RSA - The Phantom Menace".


At least don't make it worse. Every time a new program uses the name "Z3", Konrad Zuse rolls in his grave. Using it for a single program shows homage and dedication, using it twice or more is excessive.


What does Z3 in itself even mean, such that so many people conclude it's the right name for their new project?

You cannot even register most of the ~5-character domain names; who would think 2 characters would be unique?


Yes, all HIGH severity vulnerabilities (even MODERATE) have lead time.

e.g. see https://mta.openssl.org/pipermail/openssl-announce/2021-Febr...


I really like the lead time. It gives everyone time to prepare to patch ASAP when it is released without having the vulnerability available to attackers through diffing the binary.

On the other hand, it gives people who found the vulnerability independently a bit of extra time to exploit it.


> It gives everyone time to prepare to patch ASAP when it is released without having the vulnerability available to attackers through diffing the binary.

It’s an open source library so the code of the patch will be available as soon as it’s published. It’s not released as a compiled library.


Yes, indeed in the case of OpenSSL, but I was referring to applying the practice in general. With an open source product the lead time is even more important. Even then, they could release binaries from a private branch/repository so that packages can be updated before the source is released.


Imagine if computer engineers had a religion of burning down and rebuilding the entire computer architecture every 20 years, hardware and software, from scratch. Everything would be cleaner, and the problem raised by Jonathan Blow [0] could be addressed too (it's basically: modern systems have low understandability & maintainability, everything is extremely complex, and only a few people in the world can understand each low-level component; it would only take a moderate social disruption for the entire digital civilization to collapse).

[0] https://news.ycombinator.com/item?id=25788317


There is an interesting discussion going on in the recent Microsoft Exchange hack thread about firms periodically rebuilding their IT infrastructure to protect against malware, which shows some convergent thinking that might interest you: https://news.ycombinator.com/item?id=26367534

I like your idea. What do you think would be the first step to getting there?

I have a client in the controls systems space, a systems integrator that helps municipalities update the systems that keep the water running, and one concern that occurred to me while investigating their business requirements is how the systems they work on are becoming less and less able to recover from certain attacks as the components that make them work become purely digital and electronically controlled. Perhaps we should begin engraving the contents of wikipedia on clay tablets, just in case.


Elliptic Curve Cryptography still has an excellent theoretical security record, and will likely keep that record until the advent of quantum computers. Also, trustworthy digital signatures are actually not too difficult even in the doomsday scenario - if all public-key cryptosystems are broken, as long as you still have a secure hash function, you can use Merkle signatures [0].

[0] https://en.wikipedia.org/wiki/Merkle_signature_scheme
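To give a feel for how you can get signatures out of nothing but a hash function, here is a toy Python sketch of a Lamport one-time signature - the leaf primitive that the Merkle scheme then extends to many messages by hashing the one-time public keys into a tree. It's only an illustration, not a hardened implementation:

  import hashlib, os

  H = lambda data: hashlib.sha256(data).digest()

  def keygen():
      # 256 pairs of random secrets; the public key is their hashes.
      sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
      pk = [(H(a), H(b)) for a, b in sk]
      return sk, pk

  def sign(msg, sk):
      # Reveal one of the two secrets per bit of the message digest.
      d = int.from_bytes(H(msg), "big")
      return [sk[i][(d >> i) & 1] for i in range(256)]

  def verify(msg, sig, pk):
      d = int.from_bytes(H(msg), "big")
      return all(H(sig[i]) == pk[i][(d >> i) & 1] for i in range(256))

  sk, pk = keygen()
  sig = sign(b"hello", sk)
  assert verify(b"hello", sig, pk)
  assert not verify(b"tampered", sig, pk)

Each Lamport key pair may safely sign only one message; the Merkle tree is what lets you publish a single root hash as your public key and still sign many messages.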

