
Python's standard library is where things go to die, because of the terrible ad hoc versioning system (the module name is the version number) and because dynamic typing means maintainers are afraid to change anything. But even then it's still better than having no standard library at all.

The advantage of a standard library is that you only need to learn one API instead of a dozen different APIs for doing the same thing, which means you can develop a degree of mastery over it. It also reduces the friction for using better abstractions. E.g. every professional Python programmer knows defaultdict, whereas I rarely see that data structure used in other programming languages; it's too much of a leap to install a dependency just to save a few if statements, but it all adds up.
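For instance, a minimal sketch of the if statements that defaultdict saves:

```python
from collections import defaultdict

words = ["spam", "eggs", "spam"]

# With defaultdict: missing keys default to int(), i.e. 0.
counts = defaultdict(int)
for word in words:
    counts[word] += 1

# Plain-dict equivalent: the existence check defaultdict removes.
plain = {}
for word in words:
    if word not in plain:
        plain[word] = 0
    plain[word] += 1
```

Two lines saved per use site is trivial, but as the comment says, it adds up across a codebase.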



> The advantage of a standard library is that you only need to learn one API instead of a dozen different APIs for doing the same thing, which means you can develop a degree of mastery over it.

The Rust ecosystem has done well to converge on certain crates as a sort of replacement for missing std features.

In practice (at least in the rust ecosystem), I only need to learn one interface for:

* regex (regex)

* serialization (serde)

* network requests (request)

There are de-facto base crates in the ecosystem.


As a relative outsider, it’s not obvious at all that these are the right crates to choose. I appreciate the commitment to long-term stability that the standard library appears to have, but that benefit goes out the window if I accidentally rely on a third-party crate that changes its API every six months.

Looking at crates.io, regex looks pretty safe, as it’s authored by “The Rust Project Developers” and includes explicit future compatibility policies. Unfortunately, I can’t find an index of only the crates maintained by the Rust team.

Serde is obviously popular, but at first glance is a giant Swiss Army knife that will likely have lots of updates to keep track of that are completely unrelated to my project (whatever it is). If I search for JSON, I get an exact match result of the json crate, followed by a bunch of serde-adjacent crates, but not serde itself.

Request hasn’t been updated in 4 years, and has a total of less than 7000 downloads.


They probably meant reqwest (https://github.com/seanmonstar/reqwest), not request. Reqwest is maintained by the same developer (seanmonstar) as hyper, the de facto standard http library.


Ah, I did. I posted that on my phone. Autocorrect strikes again.


Because it's "reqwest", not "request".

All these libraries are very well known within the community and are what I would come up with as a complete outsider (I don't think I've written more than a hundred lines of Rust code to this date).

You can also find some pointers here:

https://github.com/rust-unofficial/awesome-rust

https://lib.rs


There's actually a more official resource: the rust cookbook[0]. This is maintained by the rust-lang team (rust-lang-nursery is an official place for crates maintained by the rust language maintainers).

[0]: https://rust-lang-nursery.github.io/rust-cookbook/


That sounds like something that could be solved by having crates.io provide a curated list of common popular crates for certain features. That is, this seems to be mostly a documentation issue.


This was attempted by the Libz Blitz back in 2017. See the rust-cookbook: https://rust-lang-nursery.github.io/rust-cookbook/ .

I agree that this should be better documented and probably more integrated with crates.io somehow.


It’s really a reputation bootstrapping problem, for which popularity can be a useful proxy. For me to use third-party code, I have to trust that the future behavior of the developers will be reasonable: I want my side projects that don’t get touched for months or years to still mostly work when I get back around to them.

Not everyone or every project will have the same desires, though. Sometimes, a fast-moving experimental library is the right choice. The trouble is figuring out which I’m looking at.


I'm not sure I follow these concerns about "working in the future" - as long as you specify versions that work for you in your Cargo.toml file, that should work at any point in the future given that you use Rust 1.x.
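As a sketch of what that pinning looks like (the crate versions here are illustrative, not recommendations):

```toml
[dependencies]
# "1.0" means >=1.0.0, <2.0.0 under Cargo's default caret rules;
# prefixing "=" pins an exact version, so nothing changes under you.
regex = "=1.3.9"
serde = { version = "1.0", features = ["derive"] }
```

Cargo also records the exact resolved versions in Cargo.lock, so a project that builds today should build the same way years later without touching the manifest.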

If you want to always be on the latest version of each crate, well, that discomfort about things potentially breaking is part of the price.


If I come back to something, it’s because I want to resume active development. Keeping a dependency pinned at an old version makes that more difficult in various ways, so I personally value forward compatibility.

Not everyone does, and that’s fine. I just want to know what a library developer’s stance on it is before I try to use their library.


Anyone can search for "awesome [lang]" when they want such a thing.


Do they also compile and run across all platforms supported by the Rust compiler?

Because that is the biggest asset from stuff being in the standard library.


No, but not all parts of std run on all supported platforms.

To clarify, there are different levels of support on different platforms.


Then it already starts with a failure of the quality gate for what goes into std.


Not really. Rust supports 8-bit microcontrollers. Lots of libraries, including parts of the standard library, make no sense on those kinds of platforms.

The standard library and 3rd party crates generally have excellent compatibility across mainstream platforms.


So do Basic, Ada, Pascal, C and C++ as well, and yet they have richer standard libraries, with deployment profiles.


Since when does libc include "rich" things like regex, serialization and HTTP? Even the C++ standard library is mostly containers and algorithms.

libc is barely a standard library, it's almost nothing.


C might not include those specific examples, but POSIX + libc already include plenty of stuff.

C++ surely does include regex, with serialisation and http scheduled for C++23, or with luck with a TR as intermediate delivery.

Although serialisation depends on static reflection being finalized as well.


Still, it implies C fails, because you don't get POSIX on all available platforms.

I think the definition is meaningless. Rust spends a huge amount of resources testing itself and its surrounding ecosystem on tier 1 platforms.

I'm fairly certain regex from the C++ std will run like crap, if at all, on something with 4 MB of RAM.


Except the small detail that C's POSIX support is much wider than Rust's tier 1 platforms.

I am fairly certain that without profiling and defining a test configuration for a set of specific C++ compilers / standard C++ library I won't assert anything about std::regex performance with 4 MB of RAM.


Perhaps. But you're comparing a 26 year old standard for a 47 year old language, to a language that's been stable for 4 years.


>deployment profiles

In other words, not all functions in the stdlib of Ada, Pascal, C and C++ can be used in all possible target environments? Sounds like a failure to quality gate those standard libraries.


Nice try at a jab there, but the subtle point you are missing is that a deployment profile is still a guarantee of support.

There aren't an endless number of profiles.


I'm not sure if you're just being disingenuous here, but you're right that you're not going to be able to use all functionality from the stdlib of Ada (and others) on every possible target; you were never, ever going to, and Rust certainly won't solve this problem for you. It's not a consequence of poor standard library design either. It might not be immediately obvious, but even C has a runtime library, which needs to be specific to the architecture and OS you're targeting. Just for a quick example, `malloc` is going to need to function differently depending on what OS you're running, and if you're targeting a microcontroller with extremely limited RAM it might not even need to be implemented at all.


I don't think the parent was claiming the Rust was better in this regard, just that it was no worse. Other languages also restrict standard library features on some platforms.


Not exactly. Rust can be made to run on a 16-bit toaster or even OsIJustWrote. Just because it can run there doesn't mean Rust std lib devs will support 16-bit toasters or OsIJustWrote.

Each platform has a different level of support, the primary tier being Windows, macOS and Linux, where every pure-Rust crate runs.

The std lib makes certain reasonable assumptions under which it works, e.g. that malloc exists and panic! is implemented.


In Python, if you're running 3.7.1 then you're also running the standard library for 3.7.1. Sure, I guess it would be possible for a programming language to decouple these things so that it's possible to ask for a particular version of the standard library (in its entirety), or a particular version of a standard library... but then programmers can no longer rely on the standard library to "just work" and "just be there", which is its appeal. If you decouple the standard library from the language, might as well switch to a Rust-like system where you simply give an official stamp of approval to certain packages regardless of who developed them.


I think you might have misunderstood me. I was contrasting one extreme interpretation with another. I was not really criticizing Python. Its large standard library is one of the things I like about it.


Java 8 saw Map gain the method computeIfAbsent; it saves a bunch of boilerplate, just like defaultdict.

https://docs.oracle.com/javase/8/docs/api/java/util/Map.html...


> dynamic typing means they are afraid to change anything

When talking of the standard library, static typing doesn't save you when breakage happens. It's better, of course: at least the compiler protects you from obvious errors (although that doesn't work for transitive dependencies, given the dynamic linking to binaries that Java / the JVM does ;-))

The problem is when a piece of code that was compiling fine a year ago, fails to compile on a newer version of the standard library, due to breaking changes, that's going to take time and effort to fix.

And this gets worse when the breakage happens in dependencies and those dependencies are no longer maintained. This can always happen of course, not just due to the standard library, but due to transitive dependencies too. But still, breakage in the standard library, or in libraries that people depend on, is a bad thing. And consider that as the number of dependencies grows, so does the probability for having dependencies that are incompatible with one another (compiled against different versions of the same dependencies).

And semantic versioning doesn't work. Breaking compatibility will inflict pain on your downstream users, no matter how many processes you have in place for communicating it. And this is especially painful when you're talking about the standard library.

If the standard library introduces breaking changes, regardless if the language is static or dynamic, then it's not a standard library that you can trust. Period.

Also — when should you break compatibility, in the standard library or in any other library?

The answer should be: never! When we want to change things, we should change the namespace and thus publish an entirely new library that can be used alongside the old one. Unfortunately this isn't a widely held view, but I wish it were.

---

Going back to the batteries included aspect of some standard libraries, like that of Python, there's one effect that I don't like and that's not very visible in Python since the bar is pretty low there.

The standard library actively discourages alternatives.

When a piece of functionality from the standard library is good enough, it's going to discourage alternatives from the ecosystem that could be much better.

Some pieces of functionality definitely deserve to be "standard". Collections for example, yes, should be standard, because libraries communicate between themselves via collections. And that's what the primary purpose of a standard library is ... interoperability. Anything else is a liability.



