Every new game feels like it needs hours of learning how it works before the fun starts, and as a working parent I might only have 30 minutes here or there to play. When I come back to a game after a couple of weeks off, I can't remember what I was doing or what the controls are. It's just not fun.
Furthermore, every time I turn my console on, everything needs an update before it can be played, so there's a 15-20 minute wait before any sort of entertainment.
Contrast this to the OG Xbox/PS2 era - I’d turn the console on and be having fun within a minute or two in a game that was easy to understand. I don’t think this was due to a lack of depth in the games either. They generally just seemed to have an “easy to learn, hard to master” aspect to them that doesn’t feel present today.
Obviously this is a huge generalisation. But the cumulative effect is that it’s switched me off gaming completely. Unless something is considered a true masterpiece, I won’t even bother.
My Xbox is packed away for now. I expect the next time I’ll turn it on will be for GTA 6.
+1, I fall into this category. It's tough.
But is it a problem for the gaming industry? How many sales can they expect from the time-poor?
I still manage to play by choosing conceptually simple games (puzzle, platformer, sports, GTA, some FPS) and playing on the Steam Deck. Portability + instant resume works well for this.
This makes it easy for me to log on, do 30 minutes of gaming, make some incremental progress, and then log off.
(My experience here is mostly with Nintendo and indie games on the Switch, for reference)
It doesn't have the sales of any Nintendo handheld that has earned its place in the history books, and it remains to be seen how long Microsoft will tolerate Proton as a Windows/DirectX translation layer.
> It's only the Xbox and Xbox One, PS4/5 where AMD CPU got used.
I think you might be forgetting about GPUs?
GameCube → Wii → Wii U = AMD/ATI GPU
Xbox 360 (AMD/ATI GPU) → One → Series = AMD GPU/CPU
PS4 → PS5 = AMD GPU/CPU