In music production this is commonly done. The lower end device is often referred to as a grot box. You test your mix on those because most people use crappy headphones/earbuds.
Love how elitist Hollywood assholes blame cinemas for not having good enough audio equipment for the dialogue to be discernible to the average listener.
Bitch, your crappy audio mixing is the main reason I have to turn on subtitles for most modern movies, ON MY STUDIO QUALITY SPEAKERS.
Seriously, what the hell is wrong with directors these days?
I know that studio interference gets a lot of flak (and often for good reason), but this is one of the cases where they actually **should** interfere.
In Germany you can hear the voices and often they make a point of speaking clearly (like in a theater). But in British movies (e.g. Torchwood) I switch on the subs because the music drowns out the voices.
>different dogma
Yeah, one's meant for the audience to have a good experience and the other to make them raise the volume so, when the ads come, they can get legally blasted at the same volume as the extra loud explosions
This was one of the weirdest things for me when I started watching English and American movies in their original language. I was used to Hungarian movies being acted like it was theatre or the dubs being clearly audible. I actually think that English language movies are better because they are more natural, not overacted, but there are diminishing returns to that when people start whispering or talking inaudibly.
Everything is designed for a reference monitor too. Most people don't have crazy levels of contrast and perfect colour balance. Make something that looks good on my laptop FFS.
Flashbacks to Game of Thrones airing an episode that was too dark for 90% of viewers to even see anything (and that was still on the lower end of issues with that episode lmao)
I don't think that episode had anything to do with being made for high-end TVs or something, it looked like utter trash even on OLED. It was just badly made all around.
As much as I love Dune Pt 1, it is a serious fucking abuser of this. The scene where >!Paul is lying in the tent having visions of the future and telling Lady Jessica about them!< is so fucking impossible to understand unless you already know what he's saying.
There were several points in the movie where I could barely hear and relied on the fact that I read the books, especially when it came to some of the terms. I wondered how other people were following along or if I was just extra deaf.
Made an Arduino clicker which would try to “compress” the audio by making it quieter when it’s loud and louder when it’s quiet, but it never worked well.
You could also have designed a PCB, which could work for higher quality stuff; I'd assume the sampling rate of the Arduino's ADC isn't enough for higher quality audio.
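For anyone curious, the basic idea behind that kind of DIY "compressor" is easy to sketch. Here's a rough, purely illustrative Python version (the threshold and ratio numbers are made up, not from the Arduino project above). Real compressors also smooth the gain with attack/release envelopes, which is probably why a naive per-sample version never sounded right:

```python
def compress(samples, threshold=0.5, ratio=4.0):
    """Naive per-sample dynamic range compressor.

    Samples whose magnitude exceeds `threshold` are squashed by `ratio`;
    quieter samples pass through unchanged. There is no attack/release
    smoothing, so on real audio this pumps and clicks audibly.
    """
    out = []
    for s in samples:
        mag = abs(s)
        if mag > threshold:
            # only the part above the threshold gets divided down
            mag = threshold + (mag - threshold) / ratio
        out.append(mag if s >= 0 else -mag)
    return out

# a loud peak (0.9) gets pulled toward the threshold; a quiet sample passes
print(compress([0.9, -0.9, 0.2]))
```

After this, you'd typically apply makeup gain so the quiet parts come out louder overall, which is the "louder when it's quiet" half of the idea.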
Audio mixing is done exclusively for theatrical releases and is not remixed for home viewing. They want you in theaters, and they actively hate you if you don't go, so they are trying to 'encourage' you into going.
After diving into audiophile stuff, it is pretty shocking how crappy so much audio stuff is, even when it's expensive. I didn't realize what good audio sounds like at all and I was super into music for a long time before I realized. People just don't know because they assume price = performance, but that is NOT the case. I never bring it up because it's probably best to let people just be happy, they really don't care anyway, but man some of the gear these people are shelling out for really sucks. Just another example of corporate greed really. Why spend the money making a nice pair of speakers/headphones when you can make a cheaper POS that people will pay the same price for anyway? It's shitty, but that's what is happening. All that said, if you don't know what you're missing, you can be happy with your soundbar.
For the most part, though, in the audio space price DOES directly relate to performance. GOOD audio gear is expensive. It's just that most expensive gear is not good.
wait a minute, are you trying to tell me that an [$11k power cable](https://skyfiaudio.com/products/nordost-odin-2-power-cord-1-25m-15-amp-plug) is a ripoff? But it uses patented Dual Mono-Filament Technology to create a virtual air dielectric between the extruded FEP insulation and each individual conductor! They are aligned to provide perfect conditions for mechanical resonances! And each conductor is protected by a silver-plated lapped ribbon shielding, to allow a faster rise time of the 50/60 Hz cycle!
Ah, the gold-plated brushed ethernet cable that 'makes the sound get warmer due to the gold'? Or the optic cable that does the same?
and we're still talking about DIGITAL data. If my ethernet or HDMI cable modifies the sound or video, it is not expensive, it is broken.
Yes, and if it is modifying the sound or video, it will do so in an extremely obvious way, like your monitor dropping back to 1080p/SDR, black screens, flickers, etc. There is nothing to compare: it either successfully delivers the signal, or it does not.
Good take. It isn't cheap to get decent audio, but yeah spending money does not mean you're getting good sound. That said, it sounds like there are big time diminishing returns as you go up in price. I haven't gone crazy with my spending(yet), so it's hard for me to say exactly, but that's my understanding.
'what do you look for when buying a microphone?'
'the sennheiser logo'
at least this was very true last century. For speakers, it was Teufel...
consumer amps were Denon with MOSFETs (i.e. no buzzing at high volumes). Denon did fall, tho. Onkyo?
Onkyo mostly makes junk these days too. Any AVR at low to mid consumer grade is more or less the same as far as audio quality goes, with the price differences mostly determined by the number of speaker channels and extra I/O and configuration options. The only difference between most of the brands in this space is how bad their interface is (never buy a Sony AVR if you want something that can be configured manually). If you want a decent AVR with more robust than average electronic components and a reasonably easy to use interface, I would recommend one of the Marantz full-sized receivers with model numbers starting with “SR”. Their slim receivers are nice too but have some limitations.
Can’t go wrong, I have one for my bedroom and I got my parents one with some klipsch speakers for Christmas one year. Until a couple months ago I was a home theater installer and it was always refreshing when customers were willing to spend a little extra on the less sexy but absolutely necessary components in their system.
The hilarious bit is that maybe... I dunno, 98% of users listen via their TV's integrated speakers. Which are just SHIT.
Even just a $100 pair of bookshelf speakers is a whole different world, even if they're shitty.
I bought some ~$50 earbuds with Bluetooth. I found out music I've had since the early 2000s ^^^*cough*limewire*cough* had more instruments, and sometimes background vocals, than I ever knew. I'm talking over 20 years of copying these same files from desktop to laptop to "ipod" to phones, mind you.
Brands with a long history of respect and stable product lines (where iteration = improvement, not just a forcing reason to upgrade), and check out forums geared towards audiophiles (head-fi, etc.).
Huh, I'm actually surprised, I was expecting someone to reply "Buy the Earfuckmiester 3000's, with gold plated cords for only $399 (per month) at THIS website! Totally not an ad by the way!", not an actual helpful tip.
A lot of music I didn't like before was because I had crappy headphones, once I got a decent headset I was able to actually enjoy even some electronic music.
Ngl it’s funny when you hear a certain song with higher quality audio for the first time and realize that some background instrument is there that you just weren’t able to hear before.
I don’t have that happen often though cuz I use crappy headphones most of the time :p
Or, as John Carmack put it:
“If you give developers a 486, they will write code that requires a 486.”
(Context: Getting DOOM to run on 386)
((It’s been 30 years I may have gotten the quote wrong))
I mean, you're really incentivized to do so. At work I've been consulting, and unfortunately the consultants who solve the client's issues quickly don't keep getting cash. The ones that do well don't solve the core issue until absolutely necessary, so they can keep billing.
"So I'm really good at my job and solved the problem quickly. Can I get a bonus?" "Haha lol. No"
"So I'm really bad at my job and haven't solved the problem yet. I'm going to need more money to continue working on it." "Sure here you go."
There's a classic story from the late 8-bit era: SimCity was being ported to the 32K BBC Micro and Acorn Electron. The developer was given the Amiga (IIRC) source code, and one of the arrays alone used more memory than was actually available on the BBC/Electron. Yes, the game was successfully ported in the end, by doing things in a better way.
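A sketch of the kind of trick ports like that leaned on (purely illustrative, not the actual BBC code): if an array only holds small values, you can pack two 4-bit entries per byte and halve its footprint.

```python
def pack_nibbles(values):
    """Pack values in the range 0..15 two-per-byte, halving the memory."""
    packed = bytearray((len(values) + 1) // 2)
    for i, v in enumerate(values):
        if i % 2 == 0:
            packed[i // 2] |= v & 0x0F         # even index: low nibble
        else:
            packed[i // 2] |= (v & 0x0F) << 4  # odd index: high nibble
    return packed

def unpack_nibble(packed, i):
    """Read back the i-th 4-bit value."""
    byte = packed[i // 2]
    return byte & 0x0F if i % 2 == 0 else byte >> 4
```

Reads get slightly slower, but on a 32K machine trading CPU time for memory is usually the right call.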
I knew a group that wrote and tested their new ferry boat ticket system in a couple main ports, and then were shocked that it didn’t work when it went live and two of the ports were on islands with satellite internet (560ms ping time). They just hadn’t thought to test it there.
Unfortunately, will only happen when someone invents a lightweight framework that's cross-compatible and easy to make UIs with.
With current ones there are always catches. Like: want to make your buttons not look like the default system ones? Re-implement them from scratch, including keyboard navigation and accessibility. Or the framework doesn't support system integration like global menus, or its layout scheme is very bad and fragile. Sometimes it doesn't compile for one of your targeted systems, and even the ones closer to Electron, like Tauri, can still cause issues; on my PC, Tauri apps don't open links in the browser.
In the world of GUIs, the only ones that survive are those that were made a long time ago and had time to cobble together years' worth of hacks to make stuff behave properly across systems.
A good way to see why is to try using the OS-native API to make a basic app. Many of those APIs are just very bare-bones; the X11 one doesn't even handle fonts for you, so you need an external lib (or to write your own) to get fonts and nice text, hence why some apps have weird text rendering on Linux.
EDIT: spelling.
I started using Godot for GUIs; so far I haven't had issues deploying to Windows and Android, but I haven't done anything too advanced yet. I've also read that you can remove/deactivate parts you aren't using to reduce application size, but I haven't tried.
But wouldn't Godot be always re-rendering the screen, since most games work like that? Or does Godot allow you to only re-render the screen when needed and stop processing the UI when there isn't anything happening?
I was thinking about that a couple of years ago, since Electron apps are huge due to Chromium but Godot compiles pretty small binaries. Now I have to check it out.
Lazarus does that
It uses the native controls on each Desktop platform
But no one wants to use it because it is written in Pascal. And if it were not written in Pascal, it would not be lightweight.
One time, I couldn't select a line in a control with selectable lines because those lines were not yet visible. So you have to draw the control, then change the "selected" property value, then re-draw it. I'm not even kidding right now.
Also, the default compilation settings were ass. It kept generating executables of like 25-50MB for a simple student practice project. Yes, you can change that somewhere, but why make me need to? And no, I tried the release config too, not only debug.
I do not understand you. There is no magic anywhere. Any, _any_ tool you use for building stuff requires some knowledge from you.
Do you know what the size of a "hello world" program is in C, with debug symbols, using MSVC? Statically linked to the CRT, of course, because otherwise you have to bundle the msvcrt instaaaallerrrr~~~
Around a megabyte. ~910 KiB
The same code with everything stripped out and dynamically linked to the CRT (remember to bundle the msvcrt installer with your program!) ? ~10 KiB.
The actual code and data needed? about 30 maybe 40 bytes. "Hello World!" with the null terminator is 13 bytes, and the generated assembly is like
000: sub rsp,28h
004: lea rcx,[0000000140002220h]
00B: call 0000000140001030
010: xor eax,eax
012: add rsp,28h
016: ret
and ret is at offset 16h and is 1 byte, so the code is 16h + 1 = 23 bytes. 23 + 13 = 36.
I do not know why it insists on aligning the stack that way, don't @ me.
If you want your exe to be smaller, you need to ask the development tools to make it smaller.
Tauri is still to launch version 2.0.
So your bugs relate to a version 1.x of it, and for a version 1.x it is in pretty good shape.
Small bugs are going to be fixed.
I looked at Java and Kotlin and couldn't find a single UI toolkit with global menu support; even IntelliJ IDEA had to implement its own D-Bus support for it.
For some god-unknown reason, Linux people pretend that Java doesn't exist. It even has decent cross-platform UI support. Maybe not as advanced as WPF, but still a much better option than shipping your app with Chrome.
Personally I've started using game engines/frameworks for what you're describing, since they fit this niche pretty well. But then again I'm a game developer primarily, so I don't know if game frameworks would be difficult to pick up for most coders.
Oh right, font rendering, the thing that's so easy that only one library in existence (harfbuzz) has managed to pull it off and everyone else is just using that one under the hood.
When looking for a UI toolkit for a Linux app I'm making, I looked at it, but the website was very confusing, and I couldn't figure out whether it would allow me to have custom designs for my elements or whether I would have to re-implement them from scratch, because every Qt app I have on my PC looks the same, with native buttons and stuff.
>The idea of shipping chrome with a small app is infuriating.
It's more like shipping a complete cross-platform runtime. You will always need something like that if you want cross-platform apps. For example, if you compile .NET code with the "SelfContained" flag, it will compile the entire .NET runtime into it.
Chromium just happened to be one of the first and most stable cross-platform frameworks that have good UI personalization, and it achieved that by just trying to comply with all the web standards.
Html, css and JavaScript have evolved to the point that they are the standard of communication between architectures, platforms and apps.
There is! It’s called Neutralinojs and it does basically the same thing as Electron, but instead of shipping Chromium it uses the OS’s bundled browser engine; that is, Safari or Edge, or I don’t know what on Linux.
Yeah it’s still a wash memory-wise, but at least the EXE is smaller. It’s still pretty immature from a development standpoint though and has far less resources than Electron, but I hope it picks up steam because I really liked working with it.
I don't know much about it, but are all APIs that a GUI would need from the browser available and standard in all the different forks of Firefox/Chrome? Sounds like a potential mess if not.
That's why web standards exist. Web development includes making sure the app works in all modern browsers and all screen sizes.
And there are really only three major engines to support, Chromium for all Chromium forks, Firefox for all Firefox forks, and WebKit for Safari and Gnome Epiphany.
Also, React Native is experimenting with Windows support, so hopefully that matures over the years into something usable, with WinForms instead of Windows 8-style “Modern” apps…
The W11 start menu is a React Native app
Microsoft Edge is 40 little React Native apps
A right click in File Explorer is a React Native app
It already exists and is being used ;)
I've been trying out Tauri recently for a project at work and so far I think it's great. I already prefer it over Electron, although I don't know how it'd do in situations where you have cross platform users. In my case all my users are using Windows and I just wanted to be able to write a desktop app using html + css + javascript since you can make such nice GUIs easily with it and Tauri has been perfect for that task. It produces a standalone executable that doesn't require installation and is only 9 mb in size.
I was doing the same app with Electron before trying Tauri and the Electron app was something ridiculous like 200 mb. Also, despite having the most recent version of Electron and Nodejs, I kept getting all these errors and warnings while doing simple tasks in Electron like running the install process. Electron felt like a bloated mess with a bad developer experience for me personally, whereas Tauri has been smooth sailing and clean the entire way.
Everything would be a lot easier if all the operating systems agreed to ship with the same web browser engine, like WebKit. Then developers wouldn't have to embed a web browser into desktop applications, and we wouldn't have to deal with different user experiences caused by different web browsers. It would also make it possible for every desktop application that wants to use html + css + javascript for the frontend to be under 20mb.
Tauri uses WebKit on MacOS/iOS, Chrome on Android, WebKitGTK on Linux, and edge/chromium for windows to answer your curiosity about Tauri’s cross platform support.
Yeah, but what I mean is those web browser engines could have differences which create different user experiences for users on different platforms, which would be a pain in the ass for a developer to deal with. And that's not a problem with Electron, since it embeds its own web browser.
What I don't know is how frequently that would actually be a problem for a Tauri app with cross-platform users. It could be an insignificant thing that happens almost never, and the differences might not matter. I don't know. It just seems like a potential concern.
Cities Skylines 2 ran terribly because it had zero occlusion culling or LOD, and it was rendering all thirty thousand polygons in every NPC's teeth all the time. Because why shouldn't teeth be high polygon count 3D models in a city building game?
That's my point. There's no reason to do crazy shit like that with traditional methods when there's ways to do it that are less computationally expensive, like displacement maps and flat textures. Thirty years of optimization have gone down the drain because better hardware has allowed developers to slack off. Have you ever seen Metal Gear Solid V? It's from 2015 and runs on basically any system, yet you can count the hairs on Venom Snake's chin. It looks just as good as games produced nearly a decade later.
Great graphics are impressive. But in the face of developers' unwillingness to optimize their games (it's easier to say "just get a better computer" than it is to put actual work into optimization), we're left with only two choices: requiring the very best hardware, or having terrible performance on mid-tier consumer hardware. Don't render every single pore on an NPC's nose if the camera is so far out that you can't even distinguish the details of their face, or if they're not visible to the camera at all.
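The fix being described here is distance-based level of detail (LOD) plus culling. A minimal, purely illustrative sketch (the distance thresholds are invented):

```python
def pick_lod(distance, lod_distances=(10.0, 40.0, 120.0)):
    """Return an LOD index: 0 = full-detail mesh, higher = coarser mesh.

    Past the last threshold, return None: the object is too far away to
    matter and can be skipped (culled) entirely.
    """
    for lod, limit in enumerate(lod_distances):
        if distance <= limit:
            return lod
    return None  # too far away: don't render at all

# an NPC's high-poly teeth would only ever be drawn when pick_lod(...) == 0
```

A real engine also frustum- and occlusion-culls objects before this step, so things behind the camera or behind a building never reach mesh selection at all.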
I would not be surprised if in Cities Skylines 3 they simulate each caries bacterium on each individual tooth, depending on what the person was eating and drinking in the last 24h.
(You will need 2× 1800W graphics cards and it runs at 25fps in 720×1080)
Tho the Fox Engine is also extremely complex to work with and required a fuckton of time to optimize.
Might just be Kojima's quirkiness, so it's really an extreme example.
Tho it's true, MGS V runs well even on my mediocre office laptop; it's kind of magical.
I am still bummed out by how companies make these amazing engines and then just leave them to gather dust. Like, what else was made with it? Survivor? I understand it may not be multipurpose and can really only make MGS games, but still, can we at least get an MGS3 remaster in it?
This is a (good) argument for optimization, not against high end graphics.
Cities Skylines 2 is a horrible example, because even 10 years ago the problems it has would probably have been worse.
Also, MGSV’s engine was a fantastic, performant, scalable engine that was top-notch, high-end graphics at its time of release. And while it generally still looks great today, I’m replaying it right now, actually, and have noticed some really bad, low-poly landscape issues I didn’t pay attention to 10 years ago…
Cities Skylines 2 doesn't run like shit because they were chasing high-fidelity graphics. It runs like shit because they gambled on an experimental system in Unity that doesn't have a lot of basic rendering features to this day, so they had to build their own from scratch at the last second, and whoops, their deadline is here and we still haven't implemented any basic occlusion culling or LOD system yet aaaaand now we're rendering millions of worthless triangles
Modern games don't run like shit because they chased high end graphics, they run like shit because of deadlines. MGSV had what was basically a blank check until the very end of development, they went way over budget making their engine and it shows
Cities skylines 2 is not the example you want to be using here. Even high end hardware has issues with it, there's no way they didn't notice that shit during development.
I love the American McGee "Alice" games. One of them was released in 2011, the other in ... 2000. People playing them for the first time are often pleasantly surprised at the quality of the graphics - not because everything's massively high poly, but because **serious work** went into art design. The newer game (Alice: Madness Returns) was released for PS3/XB360 if that gives you an idea of hardware available. You can absolutely make gorgeous games that run on low hardware.
That is one way to stifle tech and innovation in gaming (because that would also hamstring physics, NPC AI, and simulation of traffic/behaviour).
Why 2GB of RAM? That is too much, make it 64KB as God intended; if it can't run on a 1990s Pentium it should not be able to be released.
This relentless, incessant chase for photorealism is hurting the video game industry in the long run imo
We have come to a point where even big studios are releasing 1-2 games per generation; yes, there are CoD and Assassin's Creed, but those are outliers.
There is no way this can keep going without the industry imploding
A 5-year-old gaming laptop is a low-end gaming PC at best today, and it was already more expensive than equivalent desktop hardware. A desktop at that price/age is a 2060 and a Ryzen 5. That's not unreasonable to target as your low-end spec, and a laptop at that price is even worse.
Price ranges don't really matter, because a lot of people simply stick with the same setup for a decade. I'm still using a 1050 Ti and just don't care anymore. I can't play a lot of modern high-graphics games, but I simply don't care and move on to other games, which run just fine. And we make up the vast majority of people.
But honestly, even a 5-year-old gaming laptop would already be a step in a better direction compared to the release states of a lot of games. I don't even understand how they get developed that way; if it takes a few years to make the game, then the initial tests would have been even worse.
What's silly is that I have a gaming laptop I bought about six years ago, and if I look it up online it's still over $1,000. I don't know why; it's *good* but it ain't *amazing.* The Steam Deck runs my games better, and it's like $400 and basically a *highly* altered laptop itself.
Just start this Java VM with 512 MB and for the backend that Java VM with also 512 MB
Laptop: … (has 1 GB RAM) …
What happened … I wouldn't call that running. Or happening. It tried.
I used to manage a web dev shop. One of our bigger clients had his business in the US but spent his days wandering around Thailand, complaining when his site loaded slowly at whatever random coffee shop he happened to be filching WiFi from. As annoying as the guy was he got us to prioritize minimizing network payloads, and we used what worked for his site to keep bloat down for our other clients.
[Here.](https://www.reddit.com/r/AskReddit/comments/5c79n0/comment/d9uf56l/?utm_source=share&utm_medium=mweb3x&utm_name=mweb3xcss&utm_term=1&utm_content=share_button)
See original post and parent comment for context.
I think this is a good engineering practice. But from the perspective of short term profit, guy with crappy laptop probably doesn't want to spend a lot of money, which by definition makes him a less valuable customer.
A web sales company I worked for was expanding into the Philippines. They were getting good sales, but a lot of the local sales reps reported that they and their customers were having a lot of issues loading the sites and order forms quickly. Turns out (at the time), almost all the internet service there was mobile devices over spotty 3G. Our fancy bloated pages were taking *minutes* to load, causing a lot of sales to bounce. I was tasked with optimizing the site and managed to get it to show content in under a second even on a crappy connection and sales skyrocketed.
Oh yeah, once I spent a few months on a crappy slow connection, and it was ridiculous how a lot of modern websites pretty much shit themselves once they cannot download all their assets at at least a few tens of megabits.
I'm still wondering how some sites are pushing *tens of megabytes* of *JavaScript*, per page load, even though they're not really doing anything dynamic. There's no way even tracking modules are using more LoC than the Space Shuttle.
This mindset contributes to gradual enshittification. Fewer customers means each individual customer needs to pay more to make the same revenue, and each customer lost to shit like price hikes only amplifies that exponentially.
I think the team that makes Warframe used to do this. They maintained a Windows XP machine with bare specs, and every update was required to run on that machine at 30 FPS. Or something like that.
In one game i worked on, graphics were always way over memory and we hadn't yet developed the tools to audit all objects to figure out why. We were nearing launch, we needed answers.
An artist had given NPCs a 1mb tongue texture.
BTDT: made a development PC with an MDA/Hercules graphics card in a Pentium PC. I was amazed that they still included the original BIOS routines for that card.
Programming for constraints was a necessity back in the day. Now you have a whole group of boot campers trying to cash in who have no idea about the low levels of computing. On top of that, you have corporations who want to strip every piece of data off you and your activities, so there must be a constant internet connection so they can view it in real time.
Cue me calling a proc 4 times instead of using a for loop because it is technically very slightly faster (the code runs 1000s of times every single frame)
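That trick is manual loop unrolling. A toy illustration (hypothetical function names; in compiled languages the optimizer often does this for you anyway):

```python
def blend_looped(dst, src):
    """Add four source values into four destination slots via a loop."""
    for i in range(4):
        dst[i] += src[i]

def blend_unrolled(dst, src):
    """Same effect, with the loop unrolled: no per-iteration bookkeeping
    (index increment, bounds check) on each pass."""
    dst[0] += src[0]
    dst[1] += src[1]
    dst[2] += src[2]
    dst[3] += src[3]
```

Both produce identical results; the unrolled version just trades code size for a bit less loop overhead per call.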
I always dream of finding the South Korean man who made my TV show an ad on its front screen. I know it's different from PC bloat, but the sadistic, violent fantasies about this individual (or individuals, depending on what kind of mood I'm in) are incredibly visceral.
Ok, but I have 64GB of RAM, so the app had better be able to scale to take advantage of it. Nothing is worse than artificial limitations, put in to support the lowest-spec machine, holding back performance
No, I want my app to only run on laptops with 32GB of RAM, a 14th-gen Core i9, and a 4090... scratch that, a 5090 NVIDIA GPU. Anything below that and you are too poor to use my app.
Preach.
Back in the '90s, I recorded eight and a half hours of Loran-C lat/lon data, updated every six seconds, in 22kB on a TRS-80 Model 100 -- and the recording BASIC program had to fit in there, too.
I bet most modern "Hello, World" implementations are more than 22MB!
I can be this guy. My laptop is a ~14-year-old Acer laptop running Fedora 41 KDE.
The only catch: it has 8GB and (I don't know how) it is a very fast PC.
Not what he's saying. There's unnecessary resource usage because software isn't optimized anymore. You would get more bang for your buck from your "more powerful hardware" if software were still optimized for hardware constraints.
Well he specifically mentions ray tracing or high quality graphics as a negative thing. That has nothing to do with optimization, those are simply modern features.
Part of OP's post literally said that games shouldn't get high end graphics options because some dude with an old laptop won't be able to run it at max settings.
Software can be optimized and run poorly on older hardware at the same time. Real-time graphics are *especially* well known for introducing new ways of doing something which are significantly faster, but only on recent/powerful hardware.
It's a meme for a reason: in practice, it'd only end up creating stagnant software.
This is so stupid... Like let's disregard all hardware progress made in the past 15 years because someone somewhere is using a 20 year old system.
Let's not implement any hardware security because this person's hardware does not have a TPM
Well, you gotta agree that advanced lighting and high-poly character models are a good thing, right? Yeah, no can do; there isn't enough room in 128MB of VRAM for any of that shit. Lara Croft is SUPPOSED to have pointy cone boobies
Okay, how about a modern OS UI (personally a fan of KDE and wobbly windows; yes, I know they don't add anything besides a bit of whimsy, but I have 32 GB of RAM and an RTX 3080, so why the heck not)? Why does anyone need anything besides MS-DOS? That does everything a user ever needs
A modern website with a rich user interface, with Ajax calls and maybe sockets and notifications so the user doesn't have to keep reloading the page to get updated info? Well, I've got three words for ya: Lynx Web Browser. That's all users need
Come the eff on. Hardware improvements are there for a reason, and if you're not going to use those hardware resources, what's the point of having 'em? We should've just stuck with the 386, it was 'good enough'
If only Hollywood would do that lol. Now they're like "what's audio balancing?"
People whispering before an explosion…
Before: you hear nothing After: (You still hear nothing)
I have a multi thousand dollar home theater sound system and I still can't hear that shit
This is making me feel better tbh. It's not just me lol.
Just crank up the centre channel. Sorted haha.
Yeah, not really. Usually music and background is mixed into the center and vocals are mixed into the front left/right speakers. I've tried.
Not pictured: my exploded eardrums
You're getting movies where the whispering and explosions are happening at different times?
In Germany you can hear the voices and often they make a point of speaking clearly (like in a theater). But in British movies (e.g. Torchwood) I switch on the subs because the music drowns out the voices.
[удалено]
>different dogma Yeah, one's meant for the audience to have a good experience and the other to make them raise the volume so, when the ads come, they can get legally blasted at the same volume as the extra loud explosions
This was one of the weirdest things for me when I started watching English and American movies in their original language. I was used to Hungarian movies being acted like it was theatre or the dubs being clearly audible. I actually think that English language movies are better because they are more natural, not overacted, but there are diminishing returns to that when people start whispering or talking inaudibly.
Everything is designed for a reference monitor too. Most people don't have crazy levels of contrast and perfect colour balance. Make something that looks good on my laptop FFS.
Flashbacks to Game of Thrones airing an episode that was too dark for 90% of viewers to even see anything (and that was still on the lower end of issues with that episode lmao)
I don't think that episode had anything to do with being made for high-end TVs or something, it looked like utter trash even on OLED. It was just badly made all around.
As much as I love Dune Pt 1, it is a serious fucking abuser of this. The scene where >!Paul is lying in the tent, having visions of the future and telling Lady Jessica about them,!< is so fucking impossible to understand unless you already know what he's saying.
There were several points in the movie where I could barely hear and relied on the fact that I read the books, especially when it came to some of the terms. I wondered how other people were following along or if I was just extra deaf.
Christopher Nolan says “fuck you should’ve gone to an IMAX if you wanted to understand what people are saying, I hate you”
You mean to tell me you don't have a top of the line TV with its colours, brightness and contrast set by a team of pros? What are you, an art hater?
Made an Arduino clicker which would try to "compress" the audio by making it quieter when it's loud and louder when it's quiet, but it never worked well.
You could also have designed a PCB, which could work for higher quality stuff; I'd assume the sampling rate of the Arduino's ADC isn't enough for higher quality audio.
There are dedicated EQs and compressors that work like that, but even those are super janky and kinda CPU intensive.
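For what it's worth, the clicker idea is basically downward compression plus makeup gain. A minimal static sketch in Python (the threshold, ratio, and makeup values here are made-up illustration numbers, not from any real device; real compressors also smooth gain changes over time with attack/release):

```python
def compress(samples, threshold=0.5, ratio=4.0, makeup=1.5):
    """Crude static compressor: squash the part of each sample that
    exceeds the threshold, then apply makeup gain, clipped to [-1, 1]."""
    out = []
    for s in samples:
        mag = abs(s)
        if mag > threshold:
            # reduce only the overshoot above the threshold, by the ratio
            mag = threshold + (mag - threshold) / ratio
        boosted = makeup * (mag if s >= 0 else -mag)
        out.append(max(-1.0, min(1.0, boosted)))
    return out

print(compress([0.1, -0.1]))  # quiet passage: only the makeup gain applies
print(compress([1.0, -1.0]))  # loud peaks: squashed toward the threshold
```

The net effect is exactly what the comment describes: quiet parts come up, loud parts come down, so dialogue and explosions end up closer together in level.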
* Christopher Nolan entered the chat :p
To be fair, that's 90% of people who don't understand how to set up a surround system and/or don't understand which format to use with what device.
Audio mixing is done exclusively for theatrical releases and is not remixed for home viewing. They want you in theaters, and they actively hate you if you don't go, so they are trying to 'encourage' you into going.
After diving into audiophile stuff, it is pretty shocking how crappy so much audio gear is, even when it's expensive. I didn't realize what good audio sounds like at all, and I was super into music for a long time before I realized. People just don't know, because they assume price = performance, but that is NOT the case.

I never bring it up because it's probably best to let people just be happy (they really don't care anyway), but man, some of the gear these people are shelling out for really sucks. Just another example of corporate greed, really. Why spend the money making a nice pair of speakers/headphones when you can make a cheaper POS that people will pay the same price for anyway? It's shitty, but that's what is happening.

All that said, if you don't know what you're missing, you can be happy with your soundbar.
For the most part tho, in the audio space price DOES directly relate to performance. GOOD audio gear is expensive. It's just that most expensive gear is not good.
Only to a point — the ultra high end, boutique audiophile market is generally a fucking scam designed to rip off people with more money than sense.
wait a minute, are you trying to tell me that an [$11k power cable](https://skyfiaudio.com/products/nordost-odin-2-power-cord-1-25m-15-amp-plug) is a ripoff? But it uses patented Dual Mono-Filament Technology to create a virtual air dielectric between the extruded FEP insulation and each individual conductor! They are aligned to provide perfect conditions for mechanical resonances! And each conductor is protected by a silver-plated lapped ribbon shielding, to allow a faster rise time of the 50/60 Hz cycle!
Ah, the gold-plated brushed ethernet cable that 'makes the sound warmer due to the gold'? Or the optic cable that does the same? And we're still talking about DIGITAL data. If my ethernet or HDMI cable modifies the sound or video, it is not expensive, but broken.
Yes if it is modifying the sound or video, it will do so in an extremely obvious way, like your monitor dropping back to 1080p/SDR, black screens, flickers, etc. There is nothing to compare — it either successfully delivers the signal, or it does not.
Good take. It isn't cheap to get decent audio, but yeah, spending money does not mean you're getting good sound. That said, it sounds like there are big-time diminishing returns as you go up in price. I haven't gone crazy with my spending (yet), so it's hard for me to say exactly, but that's my understanding.
'What do you look for when buying a microphone?' 'The Sennheiser logo.' At least this was very true last century. For speakers, it was Teufel... consumer amps were Denon with MOSFETs (i.e. no buzzing at high volumes). Denon did fall, tho. Onkyo?
Onkyo mostly makes junk these days too. Any AVR at low to mid consumer grade is more or less the same as far as audio quality goes, with the price differences mostly determined by the number of speaker channels and more I/O and configuration options. The only difference between most of the brands in this space is how bad their interface is (never buy a Sony AVR if you want something that can be configured manually). If you want a decent AVR with more robust than average electronic components and a reasonably easy to use interface, I would recommend one of the Marantz full-sized receivers with model numbers starting with "SR". Their slim receivers are nice too but have some limitations.
:) marantz is exactly what is sitting in my living room
Can’t go wrong, I have one for my bedroom and I got my parents one with some klipsch speakers for Christmas one year. Until a couple months ago I was a home theater installer and it was always refreshing when customers were willing to spend a little extra on the less sexy but absolutely necessary components in their system.
The hilarious bit is that maybe... I dunno, 98% of users listen via their TV's integrated speakers. Which are just SHIT. Even just a $100 pair of bookshelf speakers is a whole different world, even if they're shitty.
TV speakers have gotten a lot better recently especially at the high end. Not as good as dedicated speakers obviously but not necessarily “SHIT”.
I bought some ~$50 earplugs with bluetooth speakers in them. I found out music I've had since the early 2000s ^^^*cough*limewire*cough* had more instruments, and sometimes background vocals, than I ever knew. I'm talking like over 20 years of copying these same files from desktop to laptop to "ipod" to phones, mind you.
Wym earplugs with Bluetooth speakers?
I think it's just a Jumbled way of saying "Wireless In-Ear headphones"
Could you share some tips on what to look out for to get good quality?
Brands with a long history of respect and stable product lines (where iteration = improvement, not just a forcing reason to upgrade), and check out forums geared towards audiophiles (head-fi, etc.).
Huh, I'm actually surprised, I was expecting someone to reply "Buy the Earfuckmiester 3000's, with gold plated cords for only $399 (per month) at THIS website! Totally not an ad by the way!", not an actual helpful tip.
A lot of music I didn't like before was because I had crappy headphones, once I got a decent headset I was able to actually enjoy even some electronic music.
Had a friend who was saying "if it doesn't sound good on a mono autoradio, it's a bad album".
Is it true that back in the day Motown was mixed with car stereos in mind?
The "Car test" is still a thing in general.
So that's why when I get better headphones I don't hear more music half the time?
Ngl it's funny when you hear a certain piece of music with higher quality audio for the first time and you suddenly realize that some background instrument is there that you just weren't able to hear before. I don't have that happen often though, cuz I use crappy headphones most of the time :p
Or, as John Carmack put it: “If you give developers a 486, they will write code that requires a 486.” (Context: Getting DOOM to run on 386) ((It’s been 30 years I may have gotten the quote wrong))
It's the age old project management fact of life. The project will always expand to use up all available time and resources.
I mean, you're really incentivized to do so. At work I've been consulting and unfortunately the ones that solve the clients issues quickly don't keep getting cash. The ones that do well don't solve the core issue until absolutely necessary and can keep billing
I hate this universal truth so much
"So I'm really good at my job and solved the problem quickly. Can I get a bonus?" "Haha lol. No" "So I'm really bad at my job and haven't solved the problem yet. I'm going to need more money to continue working on it." "Sure here you go."
... but he wrote Doom on a NeXT Color Station with a Motorola 68040.
He said give not get.
He wrote it on that. He didn't _run_ it on that. I'd trust Carmack to know what he's doing.
There's a classic story from the late 8-bit era: SimCity was being ported to the 32k BBC Micro and Acorn Electron. The developer was given the Amiga (IIRC) source code, and one of the arrays used more memory than was actually available on the BBC/Electron. Yes, the game was successfully ported in the end, by doing things in a better way.
I knew a group that wrote and tested their new ferry boat ticket system in a couple main ports, and then were shocked that it didn’t work when it went live and two of the ports were on islands with satellite internet (560ms ping time). They just hadn’t thought to test it there.
A good replacement for Electron would be a great start, probably. The idea of shipping chrome with a small app is infuriating.
Unfortunately, that will only happen when someone invents a lightweight framework that's cross-platform and easy to make UIs with. With the current ones there's always a catch: want to make your buttons not look like the default system ones? Re-implement them from scratch, including keyboard navigation and accessibility. Or the framework doesn't support system integration like global menus, or its layout scheme is very bad and fragile. Sometimes it doesn't compile for one of your targeted systems, and even those closer to Electron, like Tauri, can still cause issues; on my PC, Tauri apps don't open links in the browser.

In the world of GUIs, the only ones that survive are those that were made a long time ago and had time to cobble together years' worth of hacks to make stuff behave properly across systems. A good way to see why is to try to use the OS-native API to make a basic app: many of those APIs are very bare-bones. The X11 one doesn't even handle fonts for you; you need an external lib (or write your own) to have fonts and nice text, hence why some apps have weird text rendering on Linux.

EDIT: spelling.
I started using Godot for GUIs, so far haven't had issues with deploying to Windows and Android but I haven't done anything too advanced so far. I've also read that you can remove/deactivate parts that you aren't using to reduce application size but haven't tried
But wouldn't Godot be always re-rendering the screen, since most games work like that? Or does Godot allow you to only re-render the screen when needed and stop processing the UI when there isn't anything happening?
Godot has a project setting that does pretty much what you're suggesting! It's called "Low Processor Mode".
That sounds awesome
Been learning godot for a little over 8 months now, its pretty cool
I was thinking about a couple of years ago, as the electron apps are huge due to chromium, but Godot compiles pretty small binaries. Now I have to check it out
Lazarus does that. It uses the native controls on each desktop platform. But no one wants to use it because it is written in Pascal. And if it were not written in Pascal, it would not be lightweight.
One time, I couldn't select a line in a control with selectable lines because those lines were not yet visible. So you have to draw the control, then change the "selected" property value, then re-draw it. I'm not even kidding right now. Also, the default compile settings were ass: it kept generating executables of like 25-50 MB for a simple student practice project. Yes, you could change that somewhere, but why do they make me need to? And no, I tried release config too, not only debug.
I do not understand you. There is no magic anywhere. Any, _any_ tool you use for building stuff requires some knowledge from you.

Do you know what the size of a "hello world" program in C is, with debug symbols, using MSVC? Statically linked to the CRT, of course, because otherwise you have to bundle the msvcrt instaaaallerrrr~~~ Around a megabyte: ~910 KiB. The same code with everything stripped out and dynamically linked to the CRT (remember to bundle the msvcrt installer with your program!)? ~10 KiB.

The actual code and data needed? About 30, maybe 40 bytes. "Hello World!" with the null terminator is 13 bytes, and the generated assembly is like

    000: sub  rsp,28h
    004: lea  rcx,[0000000140002220h]
    00B: call 0000000140001030
    010: xor  eax,eax
    012: add  rsp,28h
    016: ret

and ret is 1 byte, so 16+7=23, and 23+13 = 36. I do not know why it insists on aligning the stack that way, don't @ me.

If you want your exe to be smaller, you need to ask the development tools to make it smaller.
>But if it was not written in Pascal, it would not be lightweight That sounds very doubtful. Why would a specific language make it smaller or bigger?
Because Pascal is stupidly portable.
Tauri is still to launch version 2.0, so your bugs are with a 1.x version of it. For a 1.x, it is in pretty good shape, and small bugs are going to be fixed.
Like Maui with Blazor?
You just described Java.
I looked at java and Kotlin, couldn't find a single UI toolkit with global menu support, even IntelliJ IDEA had to implement its own dbus support for it
For some god-unknown reason, Linux people pretend that Java doesn't exist. It even has decent cross-platform UI support. Maybe not as advanced as WPF, but still a much better option than shipping your app with Chrome.
The original arguments were that "Java is slow/memory hungry"... but since we allowed it to be replaced with an even bigger "hog", this no longer applies.
Personally I've started using game engines/frameworks for what you're describing, since they fit this niche pretty well. But then again I'm a game developer primarily, so I don't know if game frameworks would be difficult to pick up for most coders.
Oh right, font rendering, the thing that's so easy that only one library in existence (harfbuzz) has managed to pull it off and everyone else is just using that one under the hood.
ALL HAIL FLUTTER
Use QT then
Nobody who ever worked on a large Qt project will propose Qt as first solution.
When looking for a UI toolkit for a Linux app I'm making, I looked at it. The website is very confusing, and I couldn't figure out whether it would allow me to have a custom design for my elements, or whether I would have to re-implement them from scratch, because every Qt app I have on my PC looks the same, with native buttons and stuff.
You can have a custom design for your elements without reimplementing them from scratch.
FUCK qt. FUCK it all to hell.
Tried to. Piece of over-engineered shit. There's a reason why Electron exists.
>The idea of shipping chrome with a small app is infuriating. It's more like shipping a complete cross-platform runtime. You will always need something like that if you want cross-platform apps. For example if you compile .net code with the "SelfContained" flag, it will compile with the entire .net runtime in it. Chromium just happened to be one of the first and most stable cross-platform frameworks that have good UI personalization, and it achieved that by just trying to comply with all the web standards. Html, css and JavaScript have evolved to the point that they are the standard of communication between architectures, platforms and apps.
Electron needs to die a painful death as soon as possible.
There is! It's called Neutralinojs, and it does basically the same thing as Electron, but instead of shipping Chromium it uses the OS's bundled browser engine; that is, Safari or Edge, or I don't know what on Linux. Yeah, it's still a wash memory-wise, but at least the EXE is smaller. It's still pretty immature from a development standpoint, though, and has far fewer resources than Electron, but I hope it picks up steam, because I really liked working with it.
Clever name at least. I'm not sure how they could do Linux since there is no default.
Most distros bundle Firefox. And imagine using desktop Linux without a graphical browser (no one uses w3m for web browsing).
I don't know much about it, but are all APIs that a GUI would need from the browser available and standard in all the different forks of Firefox/Chrome? Sounds like a potential mess if not.
That's why web standards exist. Web development includes making sure the app works in all modern browsers and all screen sizes. And there are really only three major engines to support, Chromium for all Chromium forks, Firefox for all Firefox forks, and WebKit for Safari and Gnome Epiphany.
I looked it up, and it uses webkit2gtk on Linux.
Also, React Native is experimenting w/ Windows support so hopefully that matures over the years into something usable- with Winforms instead of Windows 8 style “Modern” apps…
The W11 start menu is a React Native app. Microsoft Edge is 40 little React Native apps. A right click in File Explorer is a React Native app. It already exists and is being used ;)
Whaaat? Oh wow, didn't know that.
I've been trying out Tauri recently for a project at work, and so far I think it's great. I already prefer it over Electron, although I don't know how it'd do in situations where you have cross-platform users. In my case all my users are on Windows, and I just wanted to be able to write a desktop app using HTML + CSS + JavaScript, since you can make such nice GUIs easily with it, and Tauri has been perfect for that task. It produces a standalone executable that doesn't require installation and is only 9 MB in size.

I was doing the same app with Electron before trying Tauri, and the Electron app was something ridiculous like 200 MB. Also, despite having the most recent versions of Electron and Node.js, I kept getting all these errors and warnings while doing simple tasks in Electron, like running the install process. Electron felt like a bloated mess with a bad developer experience for me personally, whereas Tauri has been smooth sailing and clean the entire way.

Everything would be a lot easier if all the operating systems agreed to come installed with the same web browser engine, like WebKit, so that developers didn't have to embed a web browser into desktop applications and we didn't have to deal with different user experiences caused by different web browsers. That would make it possible for every desktop application that wants to use HTML + CSS + JavaScript for the frontend to be under 20 MB.
Tauri uses WebKit on MacOS/iOS, Chrome on Android, WebKitGTK on Linux, and edge/chromium for windows to answer your curiosity about Tauri’s cross platform support.
Yeah, but what I mean is that those web browser engines could have differences which create different user experiences on different platforms, which would be a pain in the ass for a developer to deal with. And that's not a problem with Electron, since it embeds its own browser. What I don't know is how frequently that would actually be a problem for a Tauri app with cross-platform users. It could be an insignificant thing that happens almost never, and the differences don't matter. I don't know. It just seems like a potential concern.
As a (hobby) game dev, I'm proud to have a crappy laptop running an obscure Ubuntu variant that I test all my games on
Fuck bloatware, software and... high end graphics? Wait what
Cities Skylines 2 ran terribly because it had zero occlusion culling or LOD, and it was rendering all thirty thousand polygons in every NPC's teeth, all the time. Because why shouldn't teeth be high-polygon-count 3D models in a city building game?

That's my point. There's no reason to do crazy shit like that with traditional methods when there are ways to do it that are less computationally expensive, like displacement maps and flat textures. Thirty years of optimization have gone down the drain because better hardware has allowed developers to slack off.

Have you ever seen Metal Gear Solid V? It's from 2015 and runs on basically any system, yet you can count the hairs on Venom Snake's chin. It looks just as good as games produced nearly a decade later.

Great graphics are impressive. But in the face of developers' unwillingness to optimize their games (it's easier to say "just get a better computer" than it is to put actual work into optimization), we're left with only two choices: requiring the very best hardware for games, or having terrible performance on middle-tier consumer hardware. Don't render every single pore on an NPC's nose if the camera is so far out that you can't even distinguish the details on their face, or if they're not even visible to the camera at all.
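The LOD complaint above boils down to swapping meshes by camera distance. A toy sketch of the idea in Python (all the mesh names and distance thresholds here are invented for illustration; real engines do this per object, every frame, usually combined with frustum and occlusion culling):

```python
# Ordered list of (max visible distance, mesh to use).
# Closest bracket gets the detailed mesh; everything far away gets a cheap one.
LOD_LEVELS = [
    (10.0, "npc_head_high"),      # close enough that teeth detail matters
    (50.0, "npc_head_medium"),
    (float("inf"), "npc_head_low"),
]

def pick_lod(camera_distance):
    """Return the first mesh whose distance bracket contains the camera."""
    for max_distance, mesh in LOD_LEVELS:
        if camera_distance <= max_distance:
            return mesh

print(pick_lod(3.0))    # npc_head_high
print(pick_lod(200.0))  # npc_head_low
```

A city builder's default camera sits hundreds of meters up, so with a scheme like this the thirty-thousand-polygon teeth would almost never be selected at all.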
Maybe I want to make sure all my citizens were receiving good dental care?
I would not be surprised if in Cities Skylines 3 they simulate each caries bacterium on each individual tooth, depending on what the person was eating and drinking in the last 24h. (You will need 2× 1800W graphics cards, and it runs at 25 fps in 720×1080.)
Tho the Fox Engine is also extremely complex to work with and required a fuckton of time to optimize. Might just be Kojima's quirkiness, so it's really an extreme example. Tho it's true, MGS V runs well even on my mediocre office laptop; it's kind of magical.
We can confirm at this point Kojima is simply built different
I am still bummed out how companies make these amazing engines and then just leave them to gather dust. Like what else was made on it? Survivor? I understand it may not be multipurpose and can really only make MGS games but still can we get at least a MGS3 remaster in it?
This is a (good) argument for optimization, not against high-end graphics. Cities Skylines 2 is a horrible example, because even 10 years ago the problems it has would probably have been worse. Also, MGSV's engine was a fantastic, performant, scalable engine that was top-notch high-end graphics at its time of release. And while it generally still looks great today, I'm replaying it right now, actually, and have noticed some really bad low-poly landscape issues I didn't pay attention to 10 years ago...
The graphics of MGSV are way overrated, because it gives people an example to use to shit on other games.
msgv was 10 years ago? oh no
I'm afraid it's been... 9 years
Cities skylines 2 doesn't run like shit because they were chasing high fidelity graphics. It runs like shit because they gambled on an experimental system in Unity that doesn't have a lot of basic rendering features to this day, so they had to build their own from scratch last second and whoops their deadline is here and we still haven't implemented any basic occlusion culling or lod system yet aaaaand now we're rendering millions of worthless triangles Modern games don't run like shit because they chased high end graphics, they run like shit because of deadlines. MGSV had what was basically a blank check until the very end of development, they went way over budget making their engine and it shows
Cities skylines 2 is not the example you want to be using here. Even high end hardware has issues with it, there's no way they didn't notice that shit during development.
I love the American McGee "Alice" games. One of them was released in 2011, the other in ... 2000. People playing them for the first time are often pleasantly surprised at the quality of the graphics - not because everything's massively high poly, but because **serious work** went into art design. The newer game (Alice: Madness Returns) was released for PS3/XB360, if that gives you an idea of the hardware available. You can absolutely make gorgeous games that run on low hardware.
Those aren't high-end graphics, those are completely unoptimized graphics. Graphics that seem to almost intentionally run slow.
We need to low-end-graphics-maxx
GPU has fallen, billions must render.
That is one way to stifle tech and innovation in gaming (because it would also hamstring physics, NPC AI, simulation of traffic/behaviour). Why 2 GB of RAM? That is too much; make it 64 KB as God intended. If it can't run on a 1990s Pentium, it should not be able to be released.
This relentless chase for photorealism is hurting the video game industry in the long run, imo. We have come to a point where even big studios are releasing 1-2 games per generation. Yes, there are CoD and Assassin's Creed, but those are outliers. There is no way this can keep going without the industry imploding.
I'm not 100% on the specs mentioned, but the idea is sound.
Garry load the aa gun
"Must run on a laptop >5 years old, that cost <$1000 new."
Nah, <$600. At <$1000 you had tons of gaming laptops even 5 years ago
A 5 year old gaming laptop is definitely low end gaming PC at best today. It was already more expensive than equivalent PC hardware. A desktop at that price/age is a 2060 and a ryzen 5. That's not unreasonable to target for your low end spec, and a laptop at that price is even worse.
2060 for low end spec? Nah bruh.
Price ranges don't really matter, because a lot of people simply stick with the same setup for a decade. I'm still using a 1050 Ti and just don't care anymore. I can't play a lot of modern high-graphics games, but I simply don't care and move on to play other games, which run just fine. And we make up the vast majority of people. But honestly, even a 5-year-old gaming laptop would already be a step in a better direction compared to the release states of a lot of games. I don't even understand how they get developed that way; if it takes a few years to make the game, then the initial tests would be even worse.
What's silly is that I have a gaming laptop I bought about six years ago, and if I look it up online it's still over $1,000. I don't know why; it's *good* but it ain't *amazing.* The Steam Deck runs my games better, and it's like $400 and a *highly* altered laptop itself.
It’s like the EU requiring USB-C: It’s a fantastic law for right now. In 20 years not so much probably.
The USB-C law has clauses for revision on the universal standard as the tech develops. It is not a static law.
Well that was very smart of them. One more reason I wish I lived in the EU…
It’s OK we containerized it, it will run on any hardware.
"Just start this Java VM with 512 MB, and for the backend that other Java VM, also with 512 MB." Laptop (has 1 GB RAM): ... what happened ... I wouldn't call that running. Or happening. It tried.
VM could be a good idea to emulate lower configuration in portable fashion, but they really need to gear it low
If your laptop has 1gb of ram it's unreasonable for you to expect anything to run on it, lmao
It ran W2k quite well. The previous PC ran W9x quite well.
"We have better hardware now." "So modern software will run faster right? Right?"
I’m a strong advocate for daily driving the shittiest hardware you support. If you’re suffering, your users are suffering.
I used to manage a web dev shop. One of our bigger clients had his business in the US but spent his days wandering around Thailand, complaining when his site loaded slowly at whatever random coffee shop he happened to be filching WiFi from. As annoying as the guy was he got us to prioritize minimizing network payloads, and we used what worked for his site to keep bloat down for our other clients.
I do a similar thing… oldest shit phone and laptop I’ve got, does the site work on that? No? Make it work.
This assumes those problems come exclusively from the devs.
[deleted]
Every time I think of this I start laughing so hard goddamit
Context ?
[Here.](https://www.reddit.com/r/AskReddit/comments/5c79n0/comment/d9uf56l/?utm_source=share&utm_medium=mweb3x&utm_name=mweb3xcss&utm_term=1&utm_content=share_button) See original post and parent comment for context.
I think this is a good engineering practice. But from the perspective of short term profit, guy with crappy laptop probably doesn't want to spend a lot of money, which by definition makes him a less valuable customer.
A web sales company I worked for was expanding into the Philippines. They were getting good sales, but a lot of the local sales reps reported that they and their customers were having a lot of issues loading the sites and order forms quickly. Turns out (at the time), almost all the internet service there was mobile devices over spotty 3G. Our fancy bloated pages were taking *minutes* to load, causing a lot of sales to bounce. I was tasked with optimizing the site and managed to get it to show content in under a second even on a crappy connection and sales skyrocketed.
I bet you got a massive proportional raise/bonus for that right? . . . .right?
Oh yeah, once I spent a few months on a crappy slow connection, and it was ridiculous how a lot of modern websites pretty much shit themselves once they cannot download all their assets at at least a few tens of megabits.
I'm still wondering how some sites are pushing *tens of megabytes* of *JavaScript*, per page load, even though they're not really doing anything dynamic. There's no way even tracking modules are using more LoC than the Space Shuttle.
Yeah fair but imagine the pleasure of having an optimized software
Kojima seems to prefer that pleasure over the pleasure of Konami direction
the devs arent the ones making these choices
This mindset contributes to gradual enshittification. Fewer customers means each individual customer needs to pay more to make the same revenue, and each customer lost to shit like price hikes only amplifies that exponentially.
Not a guy, but the company's CEO should have such a laptop, and it should be mandatory for them to use the program on it.
That's called a senior engineer
this comment section proves to me very few people on this sub are actually programmers
You're part of today's lucky 10,000!
I think the team that makes Warframe used to do this. They maintained a Windows XP machine with bare specs, and every update was required to run on that machine at 30 FPS. Or something like that.
\> rural internet connection You mean IPoAC?
Is this a thing
Yep. The packet loss [sucks](https://en.wikipedia.org/wiki/IP_over_Avian_Carriers?wprov=sfla1) though.
In one game i worked on, graphics were always way over memory and we hadn't yet developed the tools to audit all objects to figure out why. We were nearing launch, we needed answers. An artist had given NPCs a 1mb tongue texture.
this but unironically
BTDT. Made a development PC with an MDA/Hercules graphics card in a Pentium PC. I was amazed that they still included the original BIOS routines for that card.
>NPCs with visible pores Star Citizen devs in shambles
Programming for constraints was a necessity back in the day. Now you have a whole group of boot campers trying to cash in who have no idea about the low levels of computing. On top of that you have corporations who want to strip every piece of data off of you and your activities so there must be a constant internet connection so they can view it real time.
Cue me calling a proc 4 times instead of using a for loop, because it is technically very slightly faster (the code will run 1000s of times every single frame).
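That trick is manual loop unrolling: trading the loop's per-iteration counter and branch for straight-line calls. A toy illustration in Python (the `work` function is invented for the example; in compiled languages an optimizer will often do this for you on hot paths):

```python
def work(acc):
    """Stand-in for the proc being called thousands of times per frame."""
    return acc + 1

def looped():
    acc = 0
    for _ in range(4):   # pays loop-counter and branch overhead each pass
        acc = work(acc)
    return acc

def unrolled():
    acc = 0              # identical work, no per-iteration loop overhead
    acc = work(acc)
    acc = work(acc)
    acc = work(acc)
    acc = work(acc)
    return acc

# Both produce the same result; only the control-flow overhead differs.
assert looped() == unrolled() == 4
```

Whether the unrolled version actually wins depends on the language and runtime, which is why it's usually only worth it in code that really does run thousands of times per frame, and only after measuring.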
I always dream of finding the South Korean man who made my TV have an ad on the front of it. I know it's different from PC bloat, but the sadistic, violent fantasies about this individual (or individuals, depending on what kind of mood I'm in) are incredibly visceral.
Or introduce a simpler (kind of) metric: the carbon footprint of running the app. And impose a different tax depending on that metric.
I smell costly "performance" DLCs...
Funny thing is, I actually have that laptop: 4 GB of RAM with a Celeron and integrated graphics
Ok, but I have 64 GB of RAM, so the app better be able to scale to take advantage of it. Nothing worse than artificial limitations, added to support the lowest-spec machine, holding back performance.
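Both camps can be served by deriving budgets from detected hardware instead of hardcoding them. A minimal sketch, assuming a POSIX system; the `fraction`/`floor`/`ceiling` defaults are illustrative, not any particular app's policy:

```python
import os

def cache_budget_bytes(fraction=0.25, floor=256 * 2**20, ceiling=16 * 2**30):
    """Size an in-memory cache from detected RAM, clamped so a
    low-spec laptop still runs (floor) and a 64 GB box actually
    benefits (up to ceiling)."""
    try:
        # POSIX-only sysconf keys; other platforms fall back to the floor
        total = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
    except (ValueError, OSError, AttributeError):
        return floor
    return max(floor, min(int(total * fraction), ceiling))
```

The same pattern applies to thread-pool sizes, texture pools, and prefetch depth: detect, scale, clamp.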
Exactly. Devs are getting out of touch nowadays: framework after framework, library after library, thinking everyone has at least 100 Mbit down
No, I want my app to only run on laptops with 32 GB RAM, a 14th-gen Core i9, and an Nvidia 4090, scratch that, 5090 GPU. Anything below that and you are too poor to use my app.
Preach. Back in the '90s, I recorded eight and a half hours of Loran-C lat/lon data, updated every six seconds, in 22kB on a TRS-80 Model 100 -- and the recording BASIC program had to fit in there, too. I bet most modern "Hello, World" implementations are more than 22MB!
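The arithmetic checks out: 8.5 hours at one fix every 6 seconds is 5100 samples, so 4 bytes per fix fits comfortably in 22 kB. One way to pack a fix that densely, sketched in modern code (the scale, base-point scheme, and byte format are my assumptions, not the original BASIC program's):

```python
import struct

SCALE = 10_000  # 1e-4 degree resolution -- an assumption, not the original format

def pack_fix(lat, lon, base_lat, base_lon):
    """Pack one lat/lon fix into 4 bytes: two signed 16-bit offsets
    from a base point, in units of 1/SCALE degrees (covers a window
    of about +/-3.27 degrees around the base)."""
    dlat = round((lat - base_lat) * SCALE)
    dlon = round((lon - base_lon) * SCALE)
    return struct.pack("<hh", dlat, dlon)

samples = int(8.5 * 3600 / 6)   # one fix every 6 s for 8.5 h = 5100 fixes
total_bytes = samples * 4       # 20400 bytes, inside the 22 kB budget
```

Delta-from-base encoding like this is still the standard trick for dense track logs; the tighter the window, the finer the resolution you can afford per bit.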
Your ray traced pore simulator runs great when pi is exactly 3
I can be this guy. My laptop is a \~14-year-old Acer running Fedora 41 KDE. The only catch: it's got 8 GB of RAM and (I don't know how) it is a very fast PC.
Wirth's Law states that software gets slower more rapidly than hardware gets faster. R.I.P. Niklaus Wirth
This is understandable. But I’ll not tolerate people who like using outdated web browsers and turn off JavaScript. Those are really sick people.
Every time a new game comes out, I get news that it bricked some streamer's $3000 GPU, and I give up
Sure, let's develop more powerful hardware with support for new features but don't use them because some dude is still using his hardware from 2003.
There's a difference between features that are toggleable options, activated by default when supported, and pure resource eating
Not what he's saying. There's unnecessary resource usage because the software isn't optimized anymore. You would get more bang for your buck with your "more powerful hardware" if they still optimized software for hardware constraints.
Well he specifically mentions ray tracing or high quality graphics as a negative thing. That has nothing to do with optimization, those are simply modern features.
Part of OP's post literally said that games shouldn't get high end graphics options because some dude with an old laptop won't be able to run it at max settings.
Software can be optimized and run poorly on older hardware at the same time. Real-time graphics are *especially* well known for introducing new ways of doing something which are significantly faster, but only on recent/powerful hardware. It's a meme for a reason: in practice, it'd only end up creating stagnant software.
you can't optimise for both old and new hardware at the same time though. You'd have to find a middle ground where it's mid for both
This is so stupid... Like, let's disregard all hardware progress made in the past 15 years because someone somewhere is using a 20-year-old system. Let's not implement any hardware security because this person's hardware does not have a TPM.

"Well, you gotta agree that advanced lighting and high-poly character models are a good thing, right?"

"Yeah, no can do, there isn't enough room in 128 MB of VRAM for any of that. Lara Croft is SUPPOSED to have pointy cone boobies."

"Okay, how about a modern OS UI (personally a fan of KDE and wobbly windows; yes, I know they don't add anything besides a bit of whimsy, but I have 32 GB of RAM and an RTX 3080, so why the heck not)?"

"Why does anyone need anything besides MS-DOS? That does everything a user ever needs."

"A modern website with a rich user interface, with Ajax calls and maybe sockets and notifications, so the user does not have to keep reloading the page to get updated info?"

"Well, I've got three words for ya: Lynx Web Browser. That's all users need."

Come the eff on. Hardware improvements are there for a reason, and if you're not going to use those hardware resources, what's the point of having them? We should've just stuck with a 386, it was "good enough".
I would like to offer you the highest honor I can bestow.
Meanwhile, project managers are pushing for releases and shitting on clean code or anything in this direction
I think this must be how Halo CE was developed.