Submitted by rrfe t3_122l9dz in explainlikeimfive
ExTrafficGuy t1_jdqw3tg wrote
Short answer is because they can. Apple has always used proprietary connectors to a degree, though they're far from the only computer company to have done so, especially in the early days. Vendor-locking peripherals ensures maximum compatibility, limits what types of devices can be made and who can manufacture them, makes those manufacturers sign license agreements directly with Apple, and requires them to pay royalties, again directly to Apple. Whereas Apple would make no additional money using an open standard.
The Lightning port came about for a variety of reasons. Apple had been using the 30-pin Dock Connector, which was big and chunky. Designers wanted something more elegant that would take up less internal volume inside the chassis. Remember, this was at a time when people were demanding thinner devices, yet larger batteries, so more efficient use of internal space was key. (This is also allegedly why the headphone jack was removed, if you ignore that Apple had just bought Beats and was really pushing wireless cans.) The Dock Connector also had another problem: it could only be inserted one way. INELEGANT!
The only alternative at the time was micro-USB, which most Android devices used, and which is an absolutely awful connector. It's fragile, gums up with crud, and is non-reversible. So Apple decided they needed something similarly sized that was both durable and reversible, so it didn't matter what orientation you plugged it in. Thus addressing one of the most common complaints about USB. The design would feature 8 pins, mirrored on both sides, for 16 in total. The socket would have the contacts along the outer ring, making for a solid, durable connection. Which was great for 2012. But things soon started to change in ways that rendered Lightning obsolete.
For one, Apple's Lightning specification only supported USB 2.0, which had a maximum speed of 480 Mbit/s. That was fine for phones at the time, but USB 3.0 was already out, running up to 5,000 Mbit/s. So as time marched on, the port hamstrung certain devices like the iPad Pro. Then, around 2016, USB Type-C started becoming widely available. It largely addressed the same problems as Lightning, being a durable, reversible connector. It also supported features like high-current charging, and speeds up to 20 Gbit/s, later increased to 40. That allowed mobile devices to be docked to high-resolution displays and used like a desktop, along with support for other high-bandwidth peripherals. It was quickly adopted by the wider electronics industry as a universal data and charging standard. Even Apple was quick to incorporate it on their desktops and laptops. But curiously, they kept the proprietary connector for their mobile devices.
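To put those bus speeds in perspective, here's a rough back-of-the-envelope calculation (the 10 GB file size is just an illustrative assumption, and these are theoretical maximums; real-world throughput is lower due to protocol overhead):

```python
# Theoretical transfer times for a hypothetical 10 GB file
# at each bus generation's maximum signaling rate.
SPEEDS_MBIT_S = {
    "USB 2.0 (Lightning)": 480,
    "USB 3.0": 5_000,
    "USB-C at 20 Gbit/s": 20_000,
    "USB-C at 40 Gbit/s": 40_000,
}

FILE_SIZE_GB = 10  # assumed example file size

for name, mbit_per_s in SPEEDS_MBIT_S.items():
    # 1 GB = 8,000 megabits; divide by link speed for seconds
    seconds = FILE_SIZE_GB * 8_000 / mbit_per_s
    print(f"{name}: ~{seconds:.0f} s")
```

So the same file that takes nearly three minutes over Lightning's USB 2.0 link moves in a few seconds over USB-C, which is why the old port increasingly hamstrung "pro" devices.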
Over time, there has been increasing pressure for a universal charging standard, to limit e-waste and streamline things. Since people carry around a mix of devices, it makes little sense to have to lug around multiple charge cables. Apple stubbornly held onto Lightning for their phones, you know, due to those juicy royalties. But the EU has now forced their hand. The last remaining iPads dropped Lightning for USB-C last year, and iPhones will follow next. Though Apple is still talking about using software locks to limit which peripherals these devices can use.
So why doesn't everyone use proprietary standards like Apple does? Well, over time there's been increasing pressure from consumers for universal ones. Plus, for most other electronics manufacturers, the profits they'd get from selling accessories wouldn't be enough to justify the expense of maintaining their own standard, so it's easier to use open ones. Apple only gets away with it in this day and age because they're massive, and have an equally massive ecosystem surrounding their devices.