> The only one I've seen is the garter snake, but here in Edmonton it is just a little bigger than an earthworm.
That might just be because Edmonton has lots of enormous dew worms :)
There are bigger garter snakes around - I live just outside Edmonton and see them pretty frequently. I hear there are plenty of rattlesnakes in southern Alberta too.
Note that this "strength was greatest at noon" was also said to be true of Gawaine, and there is a version of a combat where his strength waned after the noon hour and he was defeated.
"According to the Vulgate Mort Artu, Gawain had been baptised as an infant by a miracle-working holy man, also named Gawain, who named the boy after himself, and the following day announced that every day at noon, at the hour of the baptism, his power and strength will increase."
You can do a lot worse than ACH. It's hard to read, but it's simple and pretty well-defined.
What _really_ sucks is one-off fixed-width formats that aren't well defined, or that change suddenly (oh, you thought that field would always be populated? lol no.)
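For the unfamiliar: NACHA/ACH files are 94-character fixed-width records, so "parsing" is mostly slicing at known offsets. A sketch of what that looks like for an entry detail (type 6) record; the offsets are from my reading of the layout, so verify against the actual NACHA spec before trusting them:

```typescript
// Sketch of slicing an ACH entry detail record (94 fixed-width chars).
// Field offsets are from my reading of the NACHA layout - double-check
// them against the spec before using this for anything real.
interface EntryDetail {
  transactionCode: string; // e.g. "22" = checking credit
  routingNumber: string;   // receiving DFI id + check digit
  accountNumber: string;
  amountCents: number;
  individualName: string;
}

function parseEntryDetail(record: string): EntryDetail {
  if (record.length !== 94 || record[0] !== "6") {
    throw new Error("not an entry detail record");
  }
  return {
    transactionCode: record.slice(1, 3),
    routingNumber: record.slice(3, 12),
    accountNumber: record.slice(12, 29).trim(),
    amountCents: parseInt(record.slice(29, 39), 10),
    individualName: record.slice(54, 76).trim(),
  };
}
```

The simplicity is the point: the pain in ACH is in the semantics (return codes, addenda records, batch balancing), not in the slicing.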
Yeah, ACH is surprisingly not-unpleasant, at least relative to nightmares like X12 EDI with its implicit looping constructs and billions of companion guides that supersede random parts of the base spec.
X12 EDI - been there, earned the badge, both on the generating and receiving sides. 837, 834, 835 and others in health care - fun! Positional format with situational meaning... really, it is a fascinating format.
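For anyone who hasn't had the pleasure: an X12 interchange is just delimited text, with segments ended by one character and elements split by another, and the delimiters are declared by position in the fixed-width ISA header itself. A toy sketch of just the tokenizing step, assuming a well-formed ISA and ignoring sub-elements and the looping rules that make the real transaction sets painful:

```typescript
// Toy X12 tokenizer - a sketch, not a real parser. Real 837/834/835
// processing also needs sub-element separators, repetition separators,
// and the hierarchical (HL) looping rules from the companion guides.
function tokenizeX12(interchange: string): string[][] {
  // The element separator is the 4th character (right after "ISA"),
  // and the ISA segment is fixed-width: 106 characters, with the
  // segment terminator as its final character.
  const elementSep = interchange[3];
  const segmentSep = interchange[105];
  return interchange
    .split(segmentSep)
    .map(s => s.trim())
    .filter(s => s.length > 0)
    .map(s => s.split(elementSep)); // [segmentId, elem1, elem2, ...]
}

// e.g. tokenizeX12(doc).find(s => s[0] === "CLM") would give the first
// claim segment of an 837 - what that segment *means* depends on which
// loop it sits in, which is where the fun starts.
```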
I wrote the X12 EDI stuff for a major clearwater distributor... also wrote the ACH stuff that dumped someone a file, and their job was to upload it via SFTP to the bank. So many badges.
What _really really_ sucks is well-defined, reasonable, meticulously documented formats where all in-the-wild implementations stray from the spec in different ways :-/
Even calling it archaic is too harsh. Granted, the batch-centric nature is not ideal (and it's hard to imagine a system with this kind of latency built in being designed today), but if you're designing a system based on batch processing then shipping files over SFTP is a pretty reasonable way to do it.
It's very easy to get stuck inside the developer bubble, where 'archaic' means 'something that was state-of-the-art four years ago (or in the web/design world, 3.5 minutes ago)'.
SFTP is a nice technology which I make quite frequent use of in my job and at home. I plug in my Yubikey, provide a passphrase to unlock the key, and authenticate to my server, which has a verifiable certificate installed; the private key never leaves my Yubikey. That's about as state-of-the-art as I could ever ask for, security-wise.
Granted, precisely how an SFTP site is secured isn't specified and there are plenty of ways to do that wrong, but as a technology, it's always impressed me how seamlessly it works once it's set up.
SFTP tends to be the go-to method of integrating software between organizations, especially ones that aren't used to building actual APIs. It's standardized, everyone knows how to use it, and it essentially boils down to: generate a file and upload it to a server. There are all sorts of problems with it, though, and it tends to be a pretty short-sighted decision.
The biggest problem I've found is that there's no real feedback loop (for this class of integration, not SFTP specifically). Company A uploads a file -- often in the middle of the night -- and Company B processes that file some time later. Could be a minute, could be a day. Company B fails to process the file, possibly because of a change on their end, possibly because of a change on Company A's end. There's usually a bunch of back and forth, but ultimately the earliest you can determine whether a fix works is a full 24 hours after the file was initially dropped. Both companies could create integration environments with shorter feedback loops to help, but ultimately it's a problem of not getting an immediate response that the integration is working.
95% of my job involves writing software to integrate systems from different companies, and the integrations that involve SFTP are almost always the biggest pains. I've thought about writing an SFTP server that operated in real time and either rejected bad files or had some other way of responding to the inputs. I never really found the time to do it, and ultimately it'd be patching a process that should have been a well-designed API from the start.
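Short of a real API, the usual workaround is to fake a feedback loop on top of the file drop: upload, then poll for an acknowledgment file that the receiver agrees (by convention) to write. A sketch of that pattern using the ssh2-sftp-client npm package; the host, the paths, and the `orders.ack` convention are all made up for illustration:

```typescript
import Client from "ssh2-sftp-client";

// Hypothetical drop-and-poll integration: upload a file, then poll for
// an acknowledgment file that, by prior agreement, the other side
// writes once it has processed (or rejected) our upload.
async function dropAndWaitForAck(): Promise<void> {
  const sftp = new Client();
  await sftp.connect({
    host: "sftp.example-partner.com", // made-up host
    username: "acme",
    privateKey: process.env.SFTP_KEY!, // keep credentials out of code
  });
  try {
    await sftp.put("./out/orders.csv", "/inbound/orders.csv");

    // Poll /outbound for orders.ack every minute, for up to an hour.
    for (let i = 0; i < 60; i++) {
      const files = await sftp.list("/outbound");
      if (files.some(f => f.name === "orders.ack")) return;
      await new Promise(r => setTimeout(r, 60_000));
    }
    throw new Error("no ack after an hour - someone gets paged");
  } finally {
    await sftp.end();
  }
}
```

Even then the ack only tells you the file was seen, not that every record in it survived, which is exactly the class of ambiguity a real API would eliminate.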
It's tricky, though people are using them. Mastodon probably is seeing the most success at the moment. Over half a million users by current estimates. Not everyone who signs up stays around, but there's still quite a bit of retention. We could do better.
We need to make these things easier to host and deploy, though. Having worked on MediaGoblin for many years now, I've found the most depressing part of it is that not only is it hard for people to start hosting things, it's even harder for them to keep them running... and it's not just MediaGoblin; you'll find this with most social network software. Most especially, people become afraid to upgrade their systems. I'm hopeful that systems like Guix will help in that regard, but that's a whole direction to explore in itself: https://media.libreplanet.org/u/libreplanet/m/solving-the-de...
Yes. This seems to be a classic two-sided market problem. Developers won't flock to it unless there is a critical mass of users. Users won't flock to it until there are widely used and understood tools that are better than they have now.
In addition to what everyone else said, there are big regional effects: in North America cardholders have often never seen 3D Secure before, they don't know their passwords, and issuers don't care enough to make their authentication pages usable.
But in France and the UK it's pretty standard; cardholders are used to it, and issuers make an effort to decide whether it's worth requiring authentication.
I suspect the "double lightning bolt" is just a glyph to print for end of line, for CR or LF. A tape printer doesn't do anything special for CR or LF, but some of them print something. One of my machines prints "=", and one prints an oversized comma. There's no standard for that.
> The unstated assumption here is that DRM that isn't a standard won't be built into a browser.
That's not the assumption (because DRM already existed in the browser before it became a web standard - remember Silverlight?).
But it's much less work for the entities providing the DRM when EME gives them a common standard to work with. This lowers the pain (for them) of using DRM, which makes it harder to apply enough pressure on them to stop using DRM.
EME commoditizes DRM, for better or worse. Like you said, EME lowers the pain of adopting DRM. With the standard EME API and CENC (Common Encryption), services can easily support multiple DRM backends or switch between them. I worry that increased competition between DRM providers will lead to an arms race of stronger DRM tech.
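That "easily support multiple backends" part is visible right in the EME API: a player can probe key systems in preference order and use whichever one the browser exposes. A sketch, with a minimal made-up capability config; the key system strings are the well-known identifiers for Widevine, PlayReady, and the spec's own Clear Key:

```typescript
// Probe DRM backends in preference order via the standard EME entry
// point. The capability config below is a minimal made-up example.
const keySystems = [
  "com.widevine.alpha",      // Google Widevine
  "com.microsoft.playready", // Microsoft PlayReady
  "org.w3.clearkey",         // Clear Key, mandated by the EME spec
];

async function pickKeySystem(): Promise<MediaKeySystemAccess | null> {
  const config: MediaKeySystemConfiguration[] = [{
    initDataTypes: ["cenc"],
    videoCapabilities: [
      { contentType: 'video/mp4; codecs="avc1.42E01E"' },
    ],
  }];
  for (const ks of keySystems) {
    try {
      return await navigator.requestMediaKeySystemAccess(ks, config);
    } catch {
      // This key system isn't available in this browser; try the next.
    }
  }
  return null;
}
```

The same CENC-encrypted media segments can then be served to whichever CDM won, which is exactly the commoditization being described.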
EME standardized the plugin - err, "extension" - API. It in no way standardizes DRM. The standardization is around the way JavaScript inside HTML5 calls into the browser's CDM (content decryption module), which is a plugin by another name.
There are currently 3 competing technologies, with more to come, that are incompatible with FOSS and incompatible with open systems:
For Chrome Browsers there is Google Widevine CDM
For MS Browsers there is MS PlayReady
For Firefox there is the Adobe Video CDM (which is basically the video playback part of Flash)
This is the problem with EME: Netflix, Google, and Microsoft have been masterful at marketing it to people who should be able to see past the bullshit.
An EME CDM is still a binary browser plugin. Sure, we "eliminate" Flash and the need for a "3rd party" plugin, but in many ways that is worse. Before, we had a standard plugin API that allowed unsupported browsers to make use of Flash, so there were many, many browsers that could call the 3rd-party Flash application to make use of that content. Now those browsers are completely locked out. You will only be able to access content on the big 2, Chrome and IE, and maybe Firefox if they finalize their agreement with Adobe to bring a binary blob into the Firefox browser.
It's worse than that. Competition drives price to zero. That means DRM providers need to get their revenue some other way. The obvious way is to exploit (track, or worse) the user.
So if these businesses did earn money from DRM, you think they'd go "some extra profit by adding user tracking? Oh, no, thanks, we're good" and leave it on the table? Why?