Fuck that, instead of making them increase their imaginary “up to” numbers, make them advertise contractually guaranteed minimums. I’d rather have a 25Mbps minimum than a 100Mbps maximum that usually sits around 8Mbps.
When I bought internet services and colocated with major carriers, every contract came with a Quality of Service rider that stipulated guaranteed quality and quantity of service. If my metrics fell below those minimums I had recourse. But, I could not extend that to my customers because they were using a shared resource I was providing. In general, though, I agree that there should be a QoS with every user connection.
and 20Mbps for upload
What we actually care about.
100Mb/s is still pretty abysmal.
A 4x increase for download and a 7x increase for upload.
That’s a pretty solid improvement, honestly. They also have plans for when to increase it to 1Gbps down/500Mbps up, so it seems like they are taking it seriously.
It’s long overdue and gigabit should be standard
It is long overdue, as the last update was 2015, when a Democrat was President. The GOP refused to do it, and it took some time to seat a new FCC head due to Republican obstruction.
Gigabyte is coming, just not yet. This is a fine incremental step.
We should’ve had it when we paid for it, instead of telecom execs pocketing the money.
Gigabit
My third world country’s internet has a minimum of 100Mbps on most internet plans in the cities.
100Mbps in the supposed best country in the world is shit, no matter how much higher it is than 2003 standards.
lol I’ve never had anything over 12Mb/s. Currently have 8Mb/s, which costs roughly half of what I used to pay for 500kb/s.
I would love to have 100Mb/s. Hell even half that.
Satellite?
DSL
I’m so sorry.
It’s interesting. I have a remote place (not where I live) in the least populated, podunkest county in the state (which is saying something). And we were still able to get fibre and 50Mbps out there (and it could be higher, but not really worth the extra money since it’s rarely used).
Still within a couple hours of a big city, though. Guessing you’re further away than that, or something?
The 500kbps was 15 minutes outside of a metro area of 2.5 million lol
It was decades of CenturyLink making sure no one else moved in on their turf.
Where I’m at now the fiber is a couple of miles away and no cable, but 8Mbps feels lightning fast after CenturyLink lol
That’s enough to watch exactly one 1080p 30fps stream on YouTube and literally nothing else.
That’s why I stream 720p when I can lol
100Mb/s is 800Mbps. This is 25Mbps to 100Mbps so 3.125mb/s to 8.33mb/s
Mbps = Mb/s = Megabits per second.
MBps = MB/s = Megabytes per second.
The p is just the /. It’s the capital or lowercase B that makes the difference.
Shit, I found the one person who can actually remember the written difference between bit and byte.
As a computer engineer, I had better know. And don’t get me started on MiB vs MB
Please do, I’d like to know more! ;)
kB = kilobytes = 1000 bytes
MB = megabytes = 1000 kB
KiB = kibibytes = 1024 bytes
MiB = mebibytes = 1024 KiB
Generally your system will list hard drive/SSD capacity in GiB (gibibytes). This is the reason a 1 terabyte drive actually shows as something like 931 GB in your system: your system counts in GiB while the manufacturer advertises in GB.
1GB = 1,000,000,000 bytes
1GiB = 1,073,741,824 bytes
1 GB =~ 0.931 GiB
Edit: I had it backwards, it is fixed now
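For anyone who wants to check the math themselves, here’s a quick Python sketch. Nothing in it but the standard prefix definitions, so treat it as illustration rather than gospel:

```python
# Decimal (SI) vs binary (IEC) prefixes for storage sizes.
GB = 10**9    # gigabyte: what drive manufacturers advertise
GiB = 2**30   # gibibyte: what most operating systems count in

def advertised_tb_in_gib(tb: float) -> float:
    """Convert an advertised decimal-terabyte capacity to binary GiB."""
    return tb * 10**12 / GiB

print(advertised_tb_in_gib(1))  # ~931.3 -> why a "1 TB" drive shows ~931 in your OS
print(GB / GiB)                 # ~0.931 -> 1 GB is about 0.931 GiB
```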
You messed it up, actually - it’s the bi units that are 1024
I hope ya know I was just messing with ya hahaha 🤣
- 3.125MB/s to 12.5MB/s
He is right though on megabits vs megabytes. Internet speed is advertised in bits/s, whereas files and transfer speeds are usually shown in software as megabytes/s.
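The conversion itself is just a factor of 8, since there are 8 bits in a byte. A minimal sketch:

```python
def mbps_to_mbytes_per_s(megabits_per_second: float) -> float:
    """Convert a line speed in Mbps (megabits/s) to MB/s (megabytes/s)."""
    return megabits_per_second / 8  # 8 bits per byte

print(mbps_to_mbytes_per_s(25))   # 3.125 MB/s
print(mbps_to_mbytes_per_s(100))  # 12.5 MB/s
```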
100Mbps symmetric should be the minimum standard. 100Mbps down with 10Mbps up is worse than remote islands with mud huts. Seriously, I was on a Pacific island that looked like the scene of a post-hurricane photo op, and they had direct access to the fiber cables. So: gigabit symmetric internet, ONTs glued to the sides of huts, for a few bucks a month.
Cool, now make them use bytes as the system of measurement and we’ll be on to something.
I fear that will only happen when storage manufacturers are forced to use 1024 bytes per KB like everyone else.
In fairness, it’s a very longstanding tradition that serial transfer devices measure speed in bits per second rather than bytes. Bytes used to be variable in size, although we settled on eight a long time ago.
1024 bytes per KB
Technically, it’s 1000 bytes per KB and 1024 bytes per KiB. Hard drive manufacturers are simply using a different unit.
Base 10 is correct and more understandable by humans. Everyone uses it except Windows and old tools; macOS, Android (AOSP), etc. all report sizes in base 10.
Found the hard drive manufacturer.
It’s 1024. It’s always been 1024. It’ll always be 1024.
Unless of course we should start using 17.2GB RAM sticks.
There’s a conflict between the linguistic and practical implications here.
“kilo-“ means 1,000 everywhere. 1,000 is literally the definition of “kilo-“. In theory, it’s a good thing we created “kibi-“ to mean 2^10 (1024).
Why does everyone expect a kilobyte to be 1024 bytes, then? Because “kibi-“ didn’t exist yet, and some dumb fucking IBM(?) engineers decided that 1,024 was close enough to 1,000 and called it a day. That legacy carries over to today, where most people expect “kilo-“ to mean 1024 within the context of computing.
Since product terminology should generally match what the end-user expects it to mean, perhaps we should redefine “kilobyte” to mean 1024 bytes. That runs into another problem, though: if we change it now, when you look at a 512GB SSD, you’ll have to ask, “512 old gigabytes or 512 new gigabytes?”, arguably creating even more of a mess than we already have. That problem is why “kibi-“ was invented in the first place.
It’s not just the difference between kilo- and kibi-. It’s also the difference between bits and bytes. A kilobit is only 125 eight-bit bytes, whereas a kilobyte is 8,000 bits.
Computers run on binary, base 2. 1000 vs 1024: one is a power of two (2^10), the other is not.
That’s an irrelevant technical detail for modern storage. We regularly use billions or trillions of bytes. The world has mostly standardized on base 10 for large numbers, as it’s easy to understand and convert.
Literally all of the devices I own use this.
Altice (Optimum) took this opportunity to cut upload speeds from 35Mbps to 20 under the guise of the “free upgrade”. You want your old upload speeds back? Oh, that’s their most expensive tier now.
I’m dropping them, it was too unreliable for working from home. I pay twice as much now for Fios.
The “upgrade” they’re speaking of is to the cars of all the executives?
Same for my “XFinity” (Comcast) service. Literally the only plan with more than 20 up is the most expensive tier with 1200/35. Sadly, it has been that way for several years… but this year they had no choice but to jack up all rates across the board so the most expensive tier is now $30 more expensive ($90 -> $120). No other competition so… that’s that.
I care more about stability and low latency, not so much speed.
Offering me a faster cellular or satellite connection doesn’t interest me.
There are features of IPv6 that would help there. I actually think pushing that to be rolled out widely is more important than 1Gbps connections.
What about cable?
He’s a mid Deadpool villain
I’m a Booster Gold man myself.
Not an option for everyone
*cries in Australian*
100Mbps is still very slow. Much better than 25Mbps, but still slow.
I have symmetric 1Gbps and do a LOT of data transfer (compared to 99.99% of people). And even then I rarely need, or even notice, more than 100Mbps.
For most people, in the real world, why is 100Mbps “very slow”?
Because downloading large files takes hours instead of minutes
A file large enough to take hours, plural, at 100Mbps is more than 90GB. Doing that regularly is definitely not normal usage.
Tell that to everyone who has played a Rockstar game.
The average 4K BDRip movie is 60 GB, and the average AAA game is 60-100+ GB. So you are saying that watching a movie once a week and downloading one game is not normal? Using 1Gbps internet means saving 3-6 hours of time per week.
Watching movies and playing AAA games is normal, sure.
Downloading 4K BDRips and a new AAA game every week definitely isn’t. Most people probably stream their movies, and even those prone to pirating their content are likely downloading re-encoded copies, not full-sized BDRips.
On top of that, it’s not like you have to sit there and wait for it. You’re only really saving that time if it’s time you were going to spend sitting and staring at your download progress instead of doing something else.
I’m not saying edge cases don’t exist where someone would notice a real difference in having >100Mbps, but it’s just that, an edge case.
Most of the time, the idea to watch a film comes to me quite suddenly, so I have to wait until the film is at least partially downloaded before I start watching it. And even downloading an app from the repository takes a tenth of the time. And 1000Mbps internet is only 5-10 euros more expensive than 100Mbps.
The vast majority of people are not downloading multi GB files frequently
Because downloading GTA V takes 2 1/2 hours at 100Mbps, and 14 minutes at 1Gbps.
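That works out if GTA V is around 110 GB, which is roughly right for recent versions (the exact size here is an assumption). A quick sketch for running the numbers on any file and connection:

```python
def download_minutes(file_gb: float, link_mbps: float) -> float:
    """Minutes to move file_gb (decimal gigabytes) over a link_mbps link, ignoring overhead."""
    bits = file_gb * 8e9                 # GB -> bits
    return bits / (link_mbps * 1e6) / 60

# Assuming GTA V is roughly 110 GB:
print(download_minutes(110, 100))   # ~147 min, about 2.5 hours
print(download_minutes(110, 1000))  # ~14.7 min
```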
It’s amazing how much our views change with time. My dad was definitely a super early adopter of cable when it became available in our area; if I recall, it was 16 Mbps, which was unreal to me in 2002. I made do with 5 Mbps in uni and it was totally usable.
But now, I’ve had 1Gbps for years and wow it’s so different, changes your habits too. I don’t hoard installed games as much, I can pull them down in minutes so why keep something installed if I’m not going to use it?
I remember thinking, “How am I ever going to fill this 100MB hard drive? That’s so much space!” That was some time around 1997, I think.
My parents pay like 40 dollars per month for 1Mbps down and like 0.2Mbps up.
Shit, that should legitimately be illegal.
Do they also have to feed the pigeons carrying the data packets?
Aww, that’s cute.
-posted from my 768k $80/mo broadband.
I’d like to see a big government push to provide municipal broadband in every single metro area and extend it by whatever means into rural communities.
Xfinity keeps raising rates; I’m paying more now for just internet than basic cable + internet + digital voice cost back in the 00s. While it’s around 800 down, it’s still only about 40-something up, and has been like that for years and years.
I think we desperately need competition and if the government were to provide it, that’d be just fine.
But only 20 ̶d̶o̶w̶n̶ up !!
!! :-(
It really does suck. Where I live the base plan gives you 300Mbps down (which I know is pretty fast) but you are limited to 10Mbps up. As much as they tout their speeds, you’ll only get them if you pay top dollar.
Sounds like Spectrum where I live. On the bright side, our 300 down is usually closer to 350, but their 10 up is usually closer to 8. Meanwhile, you have to dig to find the upload speeds when you sign up, even though they have the download speeds plastered everywhere. Honestly, there should probably be a rule that ISPs can’t list download speeds without upload speeds right next to them.
Yeah, it is Spectrum. The company is quite irritating, and yeah, they should be required to show both up and down speeds next to each other. For a while I had T-Mobile internet, but the speeds were too inconsistent, so back to Spectrum it was.
up*
I’m sitting here fine at 30Mbps. Can have two streams no problem.
I went from a 1.5/1 Gbps fibre connection down to 20/10 Mbps when I moved. There is a MASSIVE difference. Rural internet is dog shit and no one cares.
I honestly believe that is because rural areas are almost always represented by Republicans, voted in by majority-Republican voters, both of which are extremely disinclined to make the entirety of human knowledge easily and quickly accessible, because then people might see how much better things are in other countries and start asking questions of their federal representatives.
They also fall prey to the classic “only one Internet provider” shit because of the whole “whoever pays to have the lines in owns those lines forevermore” shit we have here
It cost Comcast $10k to run a new line half a block to a place I lived 6 years ago, and that was in a rather empty part of my town.
Imagine how much it costs to run lines M I L E S to rural people’s homes. Who’s even going to try setting up there when someone else already has done it?
My area is controlled by Dems that are pretty lib, but thanks to how expensive it is to start an ISP we have literally 1 option for an almost 75 square mile area for non-satellite internet. Their max speed is 100Mbps symmetrical, you have to fill out a PDF to get service (including putting a password for your account on said PDF; I put “fuck No im not” for mine, for obvious reasons), their techs will ignore service requests (they installed their stupid rental router and charged me monthly for it despite me saying not to), and they lie (they said they couldn’t add my owned router to their list multiple times before someone finally took its fucking MAC address from me).
Your connection would not allow streaming one Blu-ray quality video stream, and good luck doing anything else on the connection while that is happening.
If your work sent you a 10GB file and you needed to send it back, it would take you over 3 hours to do that (with a functionally useless connection otherwise while downloading and uploading the file).
Downloading a popular game like Baldur’s Gate 3 would take just under 9 hours.
Downloading it twice (to play with your spouse or kids) plus updates, and then watching Netflix (which will cut into your download speed) while you wait for it to download, would toil away a weekend.
Nevermind the fact that slow Internet literally wastes away your life as you spend more micro moments just staring at blank and partially loaded websites.
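Those numbers check out if Baldur’s Gate 3 is roughly an 80 GB download (the size is an assumption on my part). The same back-of-envelope math on a 20/10 Mbps line:

```python
def transfer_hours(size_gb: float, link_mbps: float) -> float:
    """Hours to move size_gb (decimal gigabytes) over a link_mbps link, ignoring overhead."""
    return size_gb * 8e9 / (link_mbps * 1e6) / 3600

# 10 GB down at 20 Mbps, then back up at 10 Mbps:
print(transfer_hours(10, 20) + transfer_hours(10, 10))  # ~3.3 hours round trip

# An ~80 GB game download at 20 Mbps:
print(transfer_hours(80, 20))  # ~8.9 hours
```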
Your download speed being fast or slow doesn’t mean the servers hosting the data you’re accessing or the DNS servers between you and that server are going to feed you data at that speed.
Game stores like Steam and GOG can provide download speeds of up to 1Gbps or more. Torrents have no inherent speed limit either; it depends on the number of seeders.
Yes, but normal websites might not. There’s no reason to if the amount of data being transferred is so small. Even large transfers, particularly streaming video providers, will have trouble feeding data to you at 1 Gbps simply because the network interface on the server might be saturated, the switch it’s connected to might have a slower CPU, the DNS server might be tossing your data into a queue or have a slower CPU itself. There are SO MANY hops between you and whatever data you’re trying to access, and every one of them influences the speed at which data will get to you. I’m not saying gigabit speeds aren’t worth paying for, but not everyone needs those speeds, especially if their ISP’s hardware isn’t up to snuff.
Yeah, I regularly hit about 80MBps (640Mbps) from Steam. I’m pretty close to their San Diego servers, so I get the good pipes. If I was closer, I’d probably be able to hit gigabit speeds.