Actually the EU is forcing standardization of electronic devices.
USB-C2
2 C, 2 USB
USB-C2(1)_new_final
USB 2x3 v5 2.2+ 3.1
At least usb-a3 works with previous ports
USB-C2 is four fold symmetric
USB-D will be round again
Most devices are still on USB 3.1, so there is room for growth.
That being said, the newest USB protocol supports 240W charging and 20Gbps transfer rates. It’s good even for next-generation laptops, never mind phones.
I’d say a lot of devices are fine on USB2 or even 1
Some things don’t need much
That being said, there is no standard indicator on ports, chargers, and cables to signify what charging speed they support.
Sure, USB-C can technically do 240W, but most people use crappy Chinese cables that will do 5W max and blame it on the USB specification.
I’d argue that they’re partially right (or at least not entirely wrong) to blame the specification. If the specification makes it easy for crappy manufacturers to be crappy, then the specification probably should have planned for that in a better way. And crappy manufacturers being crappy is a tale as old as manufacture. Yeah I know there are cable marking requirements, but clearly nobody gives a flying fuck. The USB-IF has basically all of the power in this situation, and their members collectively control a significant percentage of the planet’s wealth, so it’s actually their problem to solve.
And crappy manufacturers being crappy is a tale as old as manufacture.
Ea Nasir catching hate still it seems.
If companies can stop cheaping out on their USB-C ports that’d be cool.
This could mean that OP has either 100+ USB chargers, or a fraction of a non-USB-C charger
deleted by creator
I can only dream of FireWire’s return
Thunderbolt is already integrated in usb-c.
notably not the same thing at all
Is the wish to bring back an outdated communication protocol or an outdated connector?
the communication protocol doesn’t seem that outdated, and the only issue with the connector seems to be price and size
The protocol is 30 years old. It has been replaced 3 times over by Thunderbolt. The connector doesn’t fit on most modern devices and has been replaced by USB-C.
Why would anyone want it back?
the peripherals that use it are significantly cheaper now than their modern alternatives, and the cables go farther than USB 2.0 let alone 3.0 or thunderbolt. I like it for reasons similar to why I like VGA, namely price while still being good enough for modern usage.
Heh, at my age (I grew up with computers since the 90s, well, earlier, but I didn’t know cables well) I assume there’s a new one every time I blink. Also at my age I don’t realize I blink as often as I do. So I just shrug, buy the cables my devices need, and don’t worry too much. I mean, it sucks, yeah, I’ve got tons of USB cables I never use anymore, but it’s how it goes. The churn is much slower than it used to be at least, so there’s less to complain about. If they ever settled on some port that’d work for over 10 years I’d prefer that, of course.
And at least there’s now one(ish) standard instead of N+1
I wanted to check out that caberQu the other guy is talking about in the comments… First time I’ve seen a Google search return a result from Lemmy. Cool.
We did it! OK guys, let’s start pumping out facts for future AI training data. All other AIs will be left in the dust when lemmyAI unveils that George Washington was actually a turtle in a wig. The people deserve to know the truth!
It will constantly spout violent language
George Washington is known for having wooden teeth, but while his false teeth appeared to be wood they were actually made from shards of turtle shell
A good one I’ve discovered while researching the architecture is to occasionally use words that are close to other words in semantic vector space, but are the wrong word for the context they’re used in. Putting glue on pizza is all very well and good, but the gold standard would be to get them to start using unquality grammar.
It would be betterest if we could organize this on a large coordinated scale. God help any AI that has been trained on any social media website. It’s just not good quality data a large percentage of the time.
deleted by creator
Everyone around the world is benefiting from the EU common charger law: https://commission.europa.eu/news-and-media/news/eu-common-charger-rules-power-all-your-devices-single-charger-2024-12-28_en
Dear Europe. Please take me in. Do you have any English speaking countries? Your laws seem to be geared towards benefiting people. Not tyrants and corporations.
Lucky for you, you can get around with English in most places.
Ireland didn’t leave the EU, so that’s an option.
In most big cities you can get around just fine. In some you can actually live very comfortably.
As far as laws go, as an EU citizen one is entitled to communication with any public institutions one may come across in their preferred “official language”. Stuff like paying your utility bills, registering health insurance, similar bureaucratic stuff, as well as getting stopped by the police. You can insist on doing it in any one of 24 languages, including English.
Usually that’s a bit overkill, and whoever you’re dealing with will be happy to speak to you in English or find someone else who does if they don’t. I assume the same goes for non-citizens. German and French are also quite popular, but English is by far the most ubiquitous.
Ireland speaks mostly English as far as I know.
The best way to learn a language is through immersion. Honestly I feel like it would be a lot of fun to learn a language in Europe since the majority of people also speak English well if you really need to fall back to that.
Ireland, but housing is shite right now.
Is housing shit because the homes need repair? Or are they shit because a single room shack is owned by corporate interests and costs 8 billion dollars a month?
housing is shit because there are no houses
Oh, thats ok. I can just live with you. I’ll sleep in the master bedroom. And you can sleep…somewhere, I imagine.
They did have one heavily English speaking country, but those guys peaced out a few years back. Now it’s just Ireland and Malta (where English is an official language).
And Cyprus.
Is English an official language there?
I was sure it was, but looking it up now, it seems it isn’t. In any case, Cyprus was a British colony, so people here are very likely to speak English.
I think the Netherlands has the highest amount of L2 English speakers.
In the Netherlands, the English language can be spoken by the vast majority of the population, with estimates of English proficiency reaching 90%[1] to 97%[2] of the Dutch population.
https://en.wikipedia.org/wiki/English_language_in_the_Netherlands
It’s not the official language though so all documents and legal stuff would be in Dutch.
100% of Irish people can speak English and do so without sounding as ridiculous as the Dutch do.
Well kinda
Zeg makker
:(
Chin up, Nederlander. I don’t think you’re ridiculous.
:)
:(
Europeans from which country get upset when they hear their fellow countrypeople speak English poorly?
Was it Germans, because there’s compulsory English education in schools?
Ime, Germans love shitting on other Germans’ English skills. I’m an English (and German) speaking immigrant in Germany, and I honestly think most people do pretty well, but nobody here finds it as impressive as I do.
It’s not the official language though so all documents and legal stuff would be in Dutch.
Well, sorta.
If you’re an immigrant there, the Vreemdelingen Politie and other authorities specifically dealing with immigrants will send you the documentation in English if you prefer.
Also banks will communicate with you in English if you want.
However, you can forget all about getting anything in English from, for example, the local authorities.
Mind you, it’s actually fun to learn Dutch IMHO, though I wouldn’t recommend reading official documentation as the best way to do it …
Any Scandinavian country should have a population ranging from proficient to fluent in English.
There’s good and bad. Every few months the EU tries to ban encryption without backdoors again for instance, because “oh dear, think of the children!”.
I’m moving to Sweden soon, just about everyone there speaks English! And Swedish is such a pretty language, I’m really excited to be immersed in it
Can confirm, took me way too long to become fluent in Swedish because I just talked English with everyone 😅
I definitely recommend practicing the language though, it’s very important for social interactions, official stuff, and many careers.
Välkommen!
Tack ❤️ About how long did it take you to become fluent?
I’m definitely a big outlier, I was always pretty bad at foreign languages in school, and I was in a very english-heavy daily environment. I have social anxiety too so I just switch to English whenever I’m worried I’ll say something wrong.
I studied Swedish in an international gymnasium and then barely passed Svenska som andra språk III in Komvux during the first 3 years I lived in Sweden and I would say I was at a B1 level after that. I went to English-language university and worked in IT afterwards so I wasn’t speaking Swedish on a daily basis, just some jobs where we would have the occasional Swedish meeting or I would send some emails in Swedish. After 10 years though I got a Swedish-language government IT job and my Swedish has improved a ton in just a few months. Nowadays after 11 years I’m definitely a C1 or C2. I might trip up and sound foreign on some complex topics, and I definitely still have an American accent, but I basically speak like a native. But yeah, it is very rare to not be able to speak English with someone on the street, but of course, it is important to learn Swedish to make social environments, paperwork, and work easier.
I would say Swedish is probably the easiest foreign language to learn as an English speaker. The sounds are quite straightforward or can be approximated, the grammar is super simplified and nearly identical to English, and much of the vocabulary consists of cognates with English. A lot of words can be verbified or adjectified so the vocabulary comes quickly. Both Swedish and English are Germanic languages with tons of French loanwords so the overlap is huge.
USB-C will be destroyed by Romulans and next we will have USB-D.
USB-C will be around for a long time, it’s a strong standard. Wireless inductive charging won’t take over for a long time because it’s limited in speed, and WiFi/Bluetooth are much slower for data transfer.
Should we tell them about usb d?
Is there any actual benefit to wireless charging? You still need to plug the charger in somewhere, and it just feels like a more expensive way to charge that’s prone to more problems.
I am all for “research for the sake of research is enough and needs no further justification.” But I still feel like I am missing something here. Why are companies producing and selling it? Am I dumb?
The only scenario where it seems useful is that you can replace your phone’s USB hardware with a small BadUSB and rely on a wireless charger while cops wonder why they can’t investigate your files on their device.
Convenience. Decor. It’s much easier to slap a phone on a charger. The chargers also look better than a cable laying around unplugged.
I have these battery packs that magnetically stick to the back of my phone and charge it. Just slap it on and forget about it.
It makes my phone hot and wastes a lot of power (I can also charge from the same battery packs using a cable, and I get noticeably more charge).
But it’s real convenient when you don’t want to worry about it. I use them at conventions or when I’m out hiking or skiing.
Same. In winter it doubles as a pocket heater. Summer is worse, I wish electronics could also feasibly convert waste heat to cooling, but physics be like “yea, nah”.
There are fans that attach to the backs of phones. Of course they use electricity as well.
Clipping a Stirling engine and a radiator to the back of a phone could be fun.
There’s the regular wireless charging where you need to put the phone on exactly the right position. That one is totally useless, since it’s even less flexible than cable charging. The only upside is that you don’t need to physically insert the cable. That’s pretty much worthless.
There’s another setup that allows you to charge over a larger area, e.g. a whole desk. That is expensive and/or much work, since it needs to be integrated into the whole area (e.g. desk) and it’s incredibly wasteful in terms of energy consumption that doesn’t actually end up charging the phone.
The only real upside I can see of wireless charging is that you can use it if your USB C port is worn out and doesn’t work any more.
I guess from a consumer perspective, it can be more convenient (e.g. wireless charging in a car)
For me, I see it as a way to reduce wear on a charging port, or as an alternative if the port does fail.
I like it for the latter. I don’t like my devices being inefficient, but it makes me feel better that should the USB-C port fail on my phone, it’s not game over for my phone.
I’ve had several phones where the USB socket stopped working reliably. At that point it’s easier to use a wireless charger.
Yes, it’s usually pocket fluff in the socket and it can be picked out, but it takes some time and care to avoid damaging the socket.
My latest case (Otter) also has a cover that is awkward to open to plug in the lead, so there’s that too.
As a bonus the charger works with Apple and Android so very convenient as my kids are Macolytes.
Wireless charging is nice for when you’re using your phone infrequently, such as at your desk while you’re working on something else. It sits there charging, you grab it to respond to a message then set it back down. No tail to worry about, it’s not getting tangled on other wires when you dare to move your phone, etc.
It’s really a feature I never cared about until I got a wireless charger as a gift
USB-C is just the connector type, not a particular speed.
True, I appreciate the correction, the actual data transfer speed is determined by the USB version.
I will never forgive USB for the ridiculous naming
Agree, it’s a total trainwreck
Uuuhhhhh, copilot, is that you?!
480 Mbps is still faster than shitty cloud services
edit: yes I know about usb 1.0 and 1.1
USB 1.0 barely got any traction. I have never seen a device in the wild.
USB 1.1 exploded in use and was fantastic compared to the mess before. It was fast enough for most file sizes at the time.
USB 2.0 is still very usable today.
Idk about the wifi thing, my phone should technically be able to do >500 Mbps to my computer yet it still transfers files at like 10 over wifi or usb
500 would be more than good enough but 10 is not
(It’s a OnePlus 12, age is not the issue)
I would also dislike the loss but I don’t think data speed is really the issue. Mostly that I couldn’t connect peripherals like my flash drive or sd card anymore
take manufacturer’s claims
divide by 10
half it
half it again
you now have the max your device will ever reach, with the usual speeds being ~60% of that
(my ISP says 300 Mbps: divide by 10, halve, halve, that’s 7.5 Mbps, which I think I’ve never seen since the actual speeds are 3 to 4)
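The rule of thumb above, spelled out in Python (tongue firmly in cheek, and the 60% factor is just the comment's own guess):

```python
# Cynical conversion from advertised speed to what you'll actually see.
def realistic_mbps(advertised_mbps):
    ceiling = advertised_mbps / 10 / 2 / 2   # divide by 10, halve it twice
    typical = ceiling * 0.6                  # usual speeds ~60% of the ceiling
    return ceiling, typical

# A 300 Mbps plan: ceiling 7.5 Mbps, typical ~4.5 Mbps.
print(realistic_mbps(300))
```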
I can get like 300 Mbps on a speed test tho
That’s probably a problem with your router or receiving hardware btw unless you’ve confirmed otherwise
Especially if you’re in an area with a lot of other wifi signals or radio frequency interference
If it’s an ISP provided router you could probably ask for them to look at it
That’s probably a problem with your router
isp provided router
receiving hardware
tried multiple devices, both wireless and wired, even with a name-brand external wireless antenna
Especially if you’re in an area with a lot of other wifi signals or radio frequency interference
Middle of nowhere countryside.
If it’s an ISP provided router you could probably ask for them to look at it
Tried, they gave me the Deny, defend, depose treatment
I would say to first try the speed on ethernet. If that’s slow, then it’s the service or the modem and not the router. I think even the worst router you can find would support at least 250 Mbps on Ethernet.
To see if it’s the router’s fault, you could try some high bandwidth local network transfer, with sftp or something. If that’s slow, if you have the money you can just buy one of those fancy gaming routers or some other highly reviewed one.
If there are a few walls or floors in between you and the router that could be the problem, and a fancier higher-power router will help with that. Another thing that could help is installing another access point near where your device is, although that’s obviously a lot of effort.
If even ethernet is slow and they refuse to help you then if you’re in the US or Canada you can try submitting a complaint on the Better Business Bureau website. This actually helped us once or twice when dealing with some cellular problems. You wouldn’t think it would do anything but I guess sometimes it gets them to pay at least a little bit of attention to the problem.
I have heard about how bad and monopolistic rural Internet can be, good luck
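If you want to rule out the transfer protocol before blaming the network, a raw socket test gives you an upper bound with no sftp or MTP overhead. A minimal sketch in Python (loopback only here; the address, port, and 64 MiB payload size are arbitrary choices, and you'd point the client at another machine's LAN IP for a real test):

```python
import socket
import threading
import time

CHUNK = b"\x00" * 65536          # 64 KiB per send
TOTAL = 64 * 1024 * 1024         # move 64 MiB total

def server(listener):
    # Accept one connection and drain TOTAL bytes from it.
    conn, _ = listener.accept()
    received = 0
    while received < TOTAL:
        data = conn.recv(65536)
        if not data:
            break
        received += len(data)
    conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))  # OS picks a free port
listener.listen(1)
port = listener.getsockname()[1]
t = threading.Thread(target=server, args=(listener,))
t.start()

client = socket.create_connection(("127.0.0.1", port))
start = time.perf_counter()
sent = 0
while sent < TOTAL:
    client.sendall(CHUNK)
    sent += len(CHUNK)
client.close()
t.join()                         # wait until the receiver has everything
elapsed = time.perf_counter() - start

mbps = sent * 8 / elapsed / 1e6
print(f"{mbps:.0f} Mbit/s")
```

Loopback numbers will be far above any Wi-Fi link; the point is that running the client against a second machine isolates the network from the file-transfer stack.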
will try over the next days, tysm!
Wifi is generally faster though, at least from phones. They often have horrible data transfer with MTP, and use USB2.0, so maybe 20-30MB/s real-world. Wifi is much faster, I usually get double that or more on my phone. Way more fun to transfer videos etc, and you don’t need to plug it to another device to push something to network storage.
How one would cut and paste videos from an android to a pc?
KDE Connect, when set up properly (it pretty much does it automatically) alongside a Linux system, lets you access the entirety of your phone’s internal storage over LAN as if it were a network drive mounted on your PC.
Kde connect was also pretty slow for me, but not any slower than MTP
I was using the Windows version tho
deleted by creator
i hate mtp until i see the 6 hours remaining on a wireless transfer and then i love mtp
Probably not since the EU has made USB-C mandatory. What can change is the protocol that runs over those wires. Like how Thunderbolt uses the USB-C connector but is not a USB protocol
Mandatory for how long? Can’t be stuck with this shitty spec for years I hope?
Mandatory for the time being. They can change the directive if they deem it necessary in case of new tech.
Sad :(
Why?
Mandatory until the European Commission updates the standard. The law mandating the use of USB-C explicitly has a procedure for how to propose a new standard to supersede the current one.
deleted by creator
The connector is ‘ok’. It’s better than MicroUSB, MiniUSB and USB-A.
If only Tim hadn’t disregarded Steve’s wishes on Lightning though: it was supposed to be handed over to USB-IF as a royalty-free standard, but instead Tim saw dollar signs and we all got a worse connector.
Reminder that Lightning is strong enough to hold up a phone for display purposes, on its own.
Not unless they want to go bigger. The USB-C pin pitch is too closely spaced for the lowest tier of printed circuit boards from all major board houses.
You might have some chargers get deprecated eventually, because there are two major forms of smart charging. The first type is done in discrete larger steps like 5V, 9V, 15V, or 20V. But there is another type that is not well advertised publicly in hype marketing nonsense, and it’s somewhat hit or miss whether the PD controller actually has the mode. That mode is continuously adjustable.
The power drop losses from something like 5V to 3.3V require a lot of overbuilding of components for heat dissipation. The required linear regulator may only have a drop of 0.4-1.2 volts from input to stable output. Building for more of a drop is just waste heat. If the charge controller can monitor the input quality and request only the required voltage for the drop with a small safety margin, components can be made smaller and cheaper. The mode to support this in USB-C exists. I think it is called PPS if I recall correctly. A month or two back I watched someone build a little electronics bench power supply using this mode of USB-C PD.
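To put rough numbers on why the adjustable mode matters: a linear regulator burns off the whole input/output difference as heat. A sketch with hypothetical values (a 3.3 V rail at 2 A with a 0.3 V dropout; none of these figures come from a datasheet):

```python
# Heat dissipated by a linear regulator: P_loss = (Vin - Vout) * I
def linreg_loss_w(vin, vout, amps):
    return (vin - vout) * amps

vout, amps, dropout = 3.3, 2.0, 0.3

# Fixed 5 V PD level: 1.7 V of headroom burned off as heat.
loss_fixed = linreg_loss_w(5.0, vout, amps)

# Adjustable (PPS-style) input: request just above the dropout voltage.
loss_pps = linreg_loss_w(vout + dropout, vout, amps)

print(f"fixed 5 V: {loss_fixed:.1f} W, adjusted 3.6 V: {loss_pps:.1f} W")
```

Roughly 3.4 W versus 0.6 W of waste heat for the same load, which is the overbuilding the comment is talking about.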
Yeah, Programmable Power Supply mode can be programmed (in realtime) to deliver from 3.3 to 21 volts in 20mV steps. For current I’m not totally sure how it works, I think you can set a limit.
There is an issue of some kind where the current limit is not reliable and requires additional circuitry. I think it was GreatScott on YT who went into that one.
What’s this about a pin pitch? Or drop losses. It sounds interesting but I don’t understand ☹️
Pin pitch is pin size and/or spacing. With physical plugs, you start to hit limitations with how small the wires can get while still being durable enough to withstand plugging/unplugging hundreds of times.
Drop losses. (I am keeping this at an ELI5 [more like ELI15, TBH] level and ignore some important stuff) Every electronic component generates heat from the power it uses. More power used usually means more heat. Heat requires physical space and lots of material to dissipate correctly. Depending on the materials used to “sink” (move; direct; channel) heat, you may need a significant amount of material to dissipate the heat correctly. So, you can use more efficient materials to reduce the amount of power that is converted to heat or improve how heat is transferred away from the component. (If you are starting to sense that there is a heat/power feedback loop here, it’s because there can be.) Since a bit of power is converted to heat, you can increase the power to your device to compensate but this, in turn, generates more heat that must be dissipated.
In short, if your device runs on 9v and draws a ton of power, you need to calculate how much of that power is going to be wasted as heat. You can Google Ohms Law if you would like, but you can usually measure a “voltage drop” across any component. A resistor, which resists electrical current, will “drop” voltage in a circuit because some of the electrical power is converted to heat.
I kinda smashed a few things together related to efficiency and thermodynamics in a couple of paragraphs, but I think I covered the basics. (I cropped a ton of stuff about Ohm’s Law and why that is important, as well as how/where heat is important enough to worry about. Long story short: heat bad)
Pin pitch means how tiny the physical pins in the connector can be spaced apart.
IR drop losses happen because a wire has resistance, it isn’t a perfect conductor. 28AWG wire has about 0.22ohm/m. Given a 2 meter cable, you might expect to see 0.44ohm one-way. Current is also travelling back, so the circuit “sees” another 0.44ohm. That’s a total of 0.88ohm
A wire will cause voltage drop following Ohm’s law: V = I * R. So for 1A of current, you will see 0.88V lost.
Say you’re trying to charge at 15W (5V 3A), your phone is only going to ‘see’ 2.36 volts, and 7.9W are wasted in the cable.
For a 100W device (20V, 5A), 4.4V are lost, also meaning 22W are wasted.
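The arithmetic above fits in a few lines of Python (same assumptions as the comment: 28 AWG at ~0.22 ohm/m, a 2 m cable, and current flowing out and back):

```python
# IR drop and wasted power in a charging cable.
OHMS_PER_M = 0.22                          # approx resistance of 28 AWG wire

def cable_loss(amps, length_m=2.0):
    r = OHMS_PER_M * length_m * 2          # round trip: out and back
    v_drop = amps * r                      # Ohm's law: V = I * R
    return v_drop, v_drop * amps           # volts lost, watts wasted as heat

for volts, amps in [(5.0, 3.0), (20.0, 5.0)]:
    drop, waste = cable_loss(amps)
    print(f"{volts:.0f}V {amps:.0f}A: drop {drop:.2f}V, wasted {waste:.1f}W")
```

It reproduces the thread's numbers (2.64 V / 7.9 W at 5 V 3 A, 4.4 V / 22 W at 20 V 5 A), and also shows why high-wattage PD raises voltage rather than current: wasted watts scale with the square of the current.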
(For others reading this, this is a perfect followup to my comment here explaining the “why”, while this is an excellent view into the “how” and picks up the bits I dropped about Ohms Law.)
Pin pitch is ultimately the spacing between traces. The traces are not as big of an issue as the actual spaces between the traces. This clearance is where things get tricky with making printed circuit boards. The process of masking off some circuit is not that hard. The way the stuff you want to keep is isolated from the copper you want to remove is the hard part. One of the issues is that you need an acid to take away the copper, but not the mask, but copper has a thickness. As the copper is etched away the acid moves sideways into the thickness too. Copper never etches completely uniformly either. The larger areas of open copper that need to be removed will etch much faster than a bunch of thinly spaced gaps. One of the tricks to design is finding ways to etch consistently with the process you build.
If you want to make super tiny traces that still have the right amount of copper and have all the gaps etched away consistently, the process of the etching toolchain becomes more expensive. You will need a stronger acid with a very good way of removing the etchant that is close to the copper and loaded with copper already. This is usually done with a stream of small bubbles, but it is risky because it could impact the adhesion of the masking material over the traces you want to keep. The stronger, hotter, and now agitated acid requires that the copper-clad board is extremely clean and the photoresist used to mask the stuff you want to keep must be very high quality. Also the resolution of this photoresist requires a much more precise form of UV exposure and development (about like developing old film photos).
So you need a better mask development toolchain, better quality photoresist. You might get away with not using photoresist at all in some other cheaper low end processes. You need the highest quality copper clad that etches more evenly, and you need a stronger acid to etch quicker straight down because a slower acid will move further sideways and ruin the thin traces to keep.
The pic has old school DIP chips in static-resistant foam. Those are the classic standard 1/10th inch (2.54mm) pin pitch. The easiest types of boards to make yourself are like the island soldering style board with the blue candy soldered on. That is a simple Colpitts oscillator for testing crystals. Then there are protoboards like the homemade Arduino Uno pictured. Then you get into the etched boards. Some of these were done with a laser printer toner transfer method. That is about the least accurate DIY method and somewhat analogous with the cheapest boards from a board house. Others were made using photoresist. This method is more accurate but involved and time consuming. One of the boards pictured is a little CH340 USB to serial board with a USB micro connector. That is getting close to my limits for etching easily. Another board has a little LCD and text. There is a small surface mounted chip pictured on the foam and that is a typical example of what kinds of pin pitches are common for the cheapest level of board production. Now there are two USB-C female connectors pictured. One has a larger pin pitch and is made for USB 2.0 connections and power. However, that other one with all those tiny tiny connections at the back is a full USB-C connector. That thing is a nightmare for tiny pin pitch. There is also a USB-C male connector with a little PCB attached. These are the types of solutions people have tried to come up with where only some small board is actually of a much higher resolution. It is not the best example but I’m not digging further through stuff to find better.
The actual pins on the little full USB-C connector are inverted to be able to flip the connector. There is a scheme present to make this a bit easier to match up the connections but it is still a pain in the ass to juggle everything around. All of the data trace pairs are differential too, which basically means they must be the same length between the source and destination. So any time they are not equal, the shorter trace must zigzag around in magic space you need to find just to make them even.
USB-D’s NUTS