We’ve had general purpose computers for decades, but every year the hardware requirements for general purpose operating systems keep increasing. I personally don’t think there has been a massive spike in productivity using a computer between when PCs usually had 256-512 MB of RAM and now, when you need at least 8 GB to have a decent experience. What has changed are growing protocol specs that are now a bloated mess, poorly optimised programs, and bad design decisions.
I like to have more than one tab open in my browser.
That used to be possible with less RAM; blame the OS, browser, and web developers.
For general use/day to day stuff like web browsing, sure, I agree, but what about things like productivity and content creation? Imagine throwing a 4K video at a machine with 512 MiB of RAM - it would probably have trouble even playing it, let alone editing/processing it.
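To put rough numbers on why that’s painful (a back-of-the-envelope sketch, assuming uncompressed 8-bit RGBA frames at 30 fps - the figures are mine, not anything stated in the thread):

```python
# Back-of-the-envelope memory maths for uncompressed 4K frames.
# Assumptions (mine): 8-bit RGBA, 4 bytes per pixel, 30 fps. Real decoders
# and editors also need reference frames, caches, the OS, and the app
# itself, so this understates the pressure if anything.

WIDTH, HEIGHT = 3840, 2160   # UHD "4K"
BYTES_PER_PIXEL = 4          # 8-bit RGBA
FPS = 30

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
second_bytes = frame_bytes * FPS
budget = 512 * 2**20         # 512 MiB

print(f"one decoded frame : {frame_bytes / 2**20:.1f} MiB")       # ~31.6 MiB
print(f"one second @ 30fps: {second_bytes / 2**30:.2f} GiB")      # ~0.93 GiB
print(f"512 MiB holds     : {budget // frame_bytes} raw frames")  # 16 frames
```

Even before counting the editor, the OS, and the codec’s own working buffers, a 512 MiB machine can only hold roughly half a second of decoded 4K frames at once.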
Your original comment mentioned general purpose computers. Video production definitely isn’t general purpose.
What do you mean by productivity?
Video production is something you can do on a general purpose computer because it runs a flexible OS that allows for a wide range of use cases, as opposed to a purpose-built embedded system that only performs the tasks for which it was designed - hence, not general purpose. I believe this was their point anyway, not just a computer for office work or whatever.
Yup, exactly this.
Video production is general purpose computing just like opening a web browser to look at pictures of cats is - it’s just that the former is way more resource intensive. It is done in software that runs on an OS that can run a dozen other things, which in turn runs on a CPU that can usually run other OSes - as opposed to a purpose-built system meant to do very specific things, with software often written specifically for it.
We’ve had video editing software available on most personal computers since at least 1999 with iMovie, and 2000 with Windows Movie Maker. IMO this is all general computer users need.
Professional-level video production is not general computing; it’s very niche. Yes, it’s nice that more people have access to this level of software, but is it responsible?
The post does raise some real issues: increasing hardware specs are not consequence-free. Rapidly increasing hardware requirements have meant most consumers have needed to upgrade their machines. Plenty of those machines could still be in operation to this day. There is a long trail of e-waste behind us that is morally reprehensible.
You don’t need to be a “professional” to edit 4k videos at home; people do that every day with videos they took on their effing phone.
And that’s the point. What people do with their computers today requires far more resources than what they did with them in the late 90s. I’m sorry, but it’s completely idiotic to believe that most people could get by with 256-512 MB of RAM.
“Morally reprehensible”? Give me a break, you simply don’t know what you’re talking about, so just stop.
My point is not that we should all go back to using old hardware right now with the current way we use our tech, because that is impossible.
My point is that the way we look at technology is wrong, and so is the way we upgrade without real reason. The average person does not need a 4k camera; it does not make them a better photographer. I’ve used digital cameras with < 15 MP sensors, and the photos generally sufficed for family/holiday snaps and professional photography. Yet there will be people who have thrown out phones because they unnecessarily want the latest camera tech. Wait till people want 8k recording (rough pixel numbers below).
That perfectly working phone that was thrown out is an example of the e-waste I was talking about. Producing computers is not without societal and environmental cost, and to throw away perfectly serviceable machines is morally reprehensible. Current culture would agree with me that it’s not sustainable, but most people aren’t ready to keep their device for 5+ years.
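For a sense of scale on the camera point (a rough pixel-count comparison - the resolutions are the common UHD definitions and the arithmetic is mine, not the commenter’s):

```python
# Rough pixel-count comparison: common video frame sizes vs still-camera
# megapixel counts. "Megapixels" here is just width * height / 1e6,
# ignoring sensor crop, binning, bit depth, etc.

frames = {
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

for name, (w, h) in frames.items():
    print(f"{name:>7}: {w * h / 1e6:4.1f} MP per frame")

# Output:
#   1080p:  2.1 MP per frame
#  4K UHD:  8.3 MP per frame
#  8K UHD: 33.2 MP per frame
```

So a ~12 MP phone sensor already out-resolves a 4K frame, while 8K needs roughly four times the pixels of 4K - which is the jump in camera tech being argued about here.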
So what are you suggesting - that everyone stick to 640x480 even though many smartphones today shoot 4K/60?
Everyone should keep their current devices as long as possible (until the device breaks or can no longer run work-related software) to reduce the upgrading culture. You can shoot 4k now, that’s great! Keep the device even if the latest model supports 8k video. The same applies to other hardware/software features.
Somewhat agree. Manufacturers now releasing successive models at less than a year’s interval is ridiculous, and you buying each new one is even more so, but on the other hand, using the same phone for 5-6 years just because you can is also a bit drastic (even if you swap the battery midway through, by the time the second one’s dead the phone will be obsolete). It’s maybe a bit more doable with computers, especially given that you can upgrade one component at a time. 2-3 years seems doable for a phone.
🎶 JAVASCRIIIPT 🎶
Ya know, I thought those were the Pillar Men menacing symbols (I don’t know Japanese scripts), and ya know what, it fits.
You have no clue what you’re talking about.
Actually does
I’m a software engineer but go off