I think the point is that the built-in ports on your computer might support any combination of a large number of possible features, and that it is difficult/expensive to make a hub that supports all of them, and people would hate a hub that happens to not support one particular feature that their computer’s built-in ports support.
It seems we're pretty well settled on the idea that on a good laptop, all USB-C ports should be capable of all features. On a hub, meanwhile, you can expect the USB-C port to do charging pass-through and nothing more; all other ports on the hub work as you would expect.
I wouldn't ever expect to see a hub where the USB-C ports pass on the full Thunderbolt capabilities.
> I wouldn't ever expect to see a hub where the USB-C ports pass on the full Thunderbolt capabilities.
Thunderbolt is specifically designed to permit daisy chaining (unlike USB). So it's actually quite normal for a device to pass Thunderbolt through to another device, as well as being a Thunderbolt device itself.
> you can plug USB-A-to-C cables into this hub, and it'd be exactly the same as plugging USB-C-to-C cables into a hypothetical USB-C-replicating hub
I.e., anything you can plug into a USB-C port using a C-to-C cable, you can plug into a USB-A port using an A-to-C cable. So if the problem is idiots thinking "if it connects, it should work" — then wouldn't these same idiots be complaining that their Thunderbolt peripherals, external USB-C monitors, etc. don't work when plugged into one of these simple C-to-A hubs via an A-to-C cable?
But that doesn't seem to stop the C-to-A hubs from existing.
It’s not at all unreasonable for consumers to want a USB hub to function like “just give my computer more ports.” So no, it wouldn’t be unreasonable or idiotic to expect something that works plugged directly into your computer to also work when plugged into your computer via a hub.
It is less reasonable for a consumer to expect that any chain of adapters or cables that uses multiple totally different types/generations of connector would still support every feature of the computer’s built-in port.
> idiots thinking "if it connects, it should work"
You mean thinking things will work in the reasonable way that they have for decades prior to USB-C?
"If it connects, it should work" used to be largely true (modulo OS compatibility, which was easy to check by users, and wasn't even an issue anyway if you had a reasonably recent Windows version). It's hard to blame the users for not understanding why things have been suddenly made much more complicated by a standard that supposedly simplifies things.
100% agreed, and this is a royal problem with USB-C.
1. Pre-USB: Several different cords, with different shape ports to distinguish them.
2. USB-A: Several identical cords, with identical ports that function the same.
3. USB-C: Several different cords, with visually identical ports and no clear way to distinguish them.
It's like the designers of USB-C saw the success of USB-A and thought it was due to minimizing the number of shapes, rather than minimizing the number of factors to be tracked. When devices come with pleas to only use the provided charger and cord, rather than any cord that meets the standard, you know that something has gone horribly wrong.
Exactly. The fact this thread exists and everyone has a pretty informed, yet divergent point of view speaks to the problem.
Imagine being a consumer product marketing person trying to sell a device that meets the needs of a MacBook Pro user, a Samsung Galaxy user and a 6th grader with a Chromebook.
Egads, yeah. There are so many use cases that no standard can accomplish all of them. USB-C tries, and even it failed to cover some pretty big ones. When I try to list out the qualities a USB-C port could have, I come up with four independent features.
0. Base level. No significant power draw in either direction. No significant data flow in either direction. I'll label this as the USB-A solution.
1. Significant power draw, recipient.
2. Significant power draw, provider.
4. Significant data throughput, recipient.
8. Significant data throughput, provider.
Between these, there's 16 different combinations, and pretty much all of them exist. As an exercise, I tried to see if I could come up with reasonable examples for every one of them.
0. Every single USB-A device.
1. Phone charger, typical use.
2. Wall wart for a phone charger.
3. Rechargeable battery.
4. Monitor with dedicated power supply.
5. Monitor without dedicated power supply.
6. Charge a phone while playing video from it.
7. External tablet, currently in use streaming video from a Raspberry Pi, Raspberry Pi powered via the tablet.
8. Desktop graphics card.
9. External graphics card.
10. Desktop graphics card powering a single-cord monitor.
11. External graphics card powering a single-cord monitor.
12. High-speed external storage.
13. High-speed external storage without a dedicated power supply.
14. First in a daisy-chained series of monitors, with a dedicated power supply.
15. USB-C hub.
Which comes back to why there are no decent USB-C hubs: every single port would be expected to be effective at every single task. That's a Herculean ask, and it strikes me as exactly why no good USB-C hub exists.
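The combinatorics in the list above can be sketched in a few lines. The flag values and the "base level" wording come from the list itself; the helper names below are my own, for illustration only:

```python
# Four independent capability bits give 2^4 = 16 possible port profiles,
# matching the numbered list above. Flag values mirror the list's numbering.

POWER_SINK   = 1  # significant power draw, recipient
POWER_SOURCE = 2  # significant power draw, provider
DATA_SINK    = 4  # significant data throughput, recipient
DATA_SOURCE  = 8  # significant data throughput, provider

FLAG_NAMES = {
    POWER_SINK:   "power-sink",
    POWER_SOURCE: "power-source",
    DATA_SINK:    "data-sink",
    DATA_SOURCE:  "data-source",
}

def describe(profile: int) -> str:
    """Render a profile number (0-15) as its set of capability flags."""
    names = [name for flag, name in FLAG_NAMES.items() if profile & flag]
    return "+".join(names) or "base (USB-A style)"

for profile in range(16):
    print(f"{profile:2d}: {describe(profile)}")
```

A hub that "just works" would need every port to handle all sixteen profiles, which is the Herculean part.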
USB-A did a lot of things incredibly well (non-streaming data transfer, peripheral management), and a lot of things relatively poorly (device charging, video display). USB-C tried to improve on the aspects that USB-A was bad at, and made an absolute mess of the things that USB-A was good at.
The advantage of USB-C, in case you don't realize it, is that there are many devices that want to be clients sometimes and hosts other times. You know, like being able to plug your phone into your computer, or your gamepad into your phone.
Prior to USB-C, USB ports were either "host ports" or "client ports" (thus all the different shapes); and devices were required to assume that if something was plugged into a client port of theirs, it was a host; and if something was plugged into a host port of theirs, it was a client. No dynamic host/client negotiation — there wasn't even a protocol for it.
Instead, there was "USB On The Go" (USB-OTG), which would signal a "USB-OTG" compatible device on its normally-a-client port, to switch that port+USB controller into host mode, based on the physical wiring of the host side of the special "client but actually a host" connector on a special USB-OTG cable. And, of course, users would never consider this problem ahead of time — why would they? — so there was no demand for host/client hybrid devices to ship with USB-OTG cables; instead, they were something you'd realize you'd need only too late, once you already had the problem; and now you're running to an electronics store at 8PM so you can plug your Android tablet into a scanner.
With USB-C, all that is gone. USB-C devices are required to signal whether they're a host, client, or hybrid device, so that hybrid devices can auto-discover whether it's a host, client, or another hybrid device on the other end of the line, and do the sensible thing in each case. No need for special "override" cables. And no need for discrete host-side vs. client-side connectors. Host-ness vs. client-ness is now a logical part of the USB protocol, rather than an electrical part of the protocol.
This also means that if you have, say, a laptop, you only need to bring one (good, beefy) USB-C cable with you — probably the one that came with your laptop's USB-C AC adapter; and this cable will both charge your laptop from the wall, and allow you to plug in any USB-C devices you might need to connect to. Or, in the case of a display or hub with power-delivery — both! (IMHO the point of the original MacBook's single-port USB-C design was to serve as a forcing function to make people realize that the combination of battery life + USB-C "switch-around-ability" means they can — in most casual use-cases — just alternate between charging and connectivity through a single port+cable, rather than needing both at once and thus needing two ports + two cables.)
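To make the host/client auto-discovery above concrete: in Type-C, a port advertises its default role through termination resistors on the CC pins (a host/source presents an Rp pull-up, a device/sink presents an Rd pull-down, and a dual-role port alternates between the two until it sees the opposite termination). This toy model only captures the resulting decision table; the function and string names are mine, not the spec's state-machine names:

```python
# Toy model of USB Type-C role detection. A source-only port presents Rp
# on its CC pin, a sink-only port presents Rd, and a dual-role port (DRP)
# toggles between the two until it detects the opposite termination.

RP, RD = "Rp", "Rd"   # pull-up (host/source) vs. pull-down (device/sink)

def resolve(a: str, b: str) -> str:
    """Decide which side becomes host given each side's CC termination."""
    if a == RP and b == RD:
        return "a is host, b is device"
    if a == RD and b == RP:
        return "b is host, a is device"
    return "no valid connection (same termination on both sides)"

print(resolve(RP, RD))  # classic host-to-device attach
print(resolve(RD, RD))  # two sink-only devices: nothing happens
```

The point is that role assignment falls out of a symmetric electrical handshake, with no special "override" cable needed.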
> "If it connects, it should work" used to be largely true
Have you ever worked with DB-9 or DB-25 serial cables? If you had a cable that was wired correctly for the connection you wanted to make, it should indeed work, but you still would have to figure out baud rate, number of stop bits, and parity. Detecting what kind of null-modem cable you had basically came down to getting out your multimeter.
On PCs, you also had the traditional parallel port (mostly unidirectional, with IIRC two bits of information that a printer could send back: “don’t send more data; I’m busy handling the earlier data”, and “out of paper”) versus the much faster, bidirectional iteration that was used to connect such things as hard drives and CD players.
> "If it connects, it should work" used to be largely true
Gods no! There are innumerable devices with D-sub, mini-DIN, RJ-11, RCA, or TRS connectors, that put those connectors to entirely different and incompatible purposes. With entirely-incompatible electrical standards, even, such that plugging client-devices intended for one purpose into host-devices intended for a different purpose will electrically short one or both.
Remember the iPod Shuffle's "USB-A to TRRS" cable? Guess what happens when you try to use that cable to, say, plug your computer into your hi-fi? Or — less absurdly — into your TI-84 calculator?
The "Universal" in USB is in contrast to the DB-9 serial port, which was decidedly not universal. Yes, there were actual "serial ports" and "serial port devices" — the spec is called EIA/TIA 232, if you're curious — but there were tons of other things that used DB-9 connectors (male or female) but weren't EIA/TIA 232 devices. Plug one of those into a serial port, and poof! — there goes the magic smoke.
You don't have to worry about that with USB (including USB-C); USB ports and cables all follow a single unified electrical pin-out specification, whatever they're carrying. Even USB-C devices that mis-implement features like USB-PD (e.g. the Nintendo Switch) won't ever start an electrical fire, because — whatever logic-level protocol support they might have mis-implemented — they're still "USB-standard devices" on the electrical-pinout level.
Now, these are all fairly-old connectors I'm mentioning, so you'd think we'd have learned better by now; but special mention goes to the Atari joystick port (technically a type of D-sub connector), which is still to this day slapped onto arbitrary generic Shenzhen-special "TV plug-and-play consoles" — with no two devices being wired the same way, and so only actually supporting the gamepads the console comes with.
When you think about it, relative to this disaster, USB-C does guarantee you that 1. plugging any two USB-C ports together with a USB-C cable won't throw your circuit breaker; and 2. as long as both devices are data devices, you'll get at least USB2 data-transfer out of the arrangement (if you're plugging in a legacy USB2 device), and much more likely USB3 speeds. Everything is actually compatible with everything, as long as you don't care about getting the fastest line-rates possible.