• 0 Posts
  • 530 Comments
Joined 1 year ago
Cake day: October 4th, 2023


  • One other factor that I think is an issue with motion blur: the modeling of shifting gaze in video games often isn’t fantastic, due to input and output device limitations.

    So, say you’re just looking straight ahead in a game. Then motion blur might be fine – only moving objects are blurred.

    But one very prominent place where motion blur shows up is when the direction of your view is changing.

    In a video game, especially if you’re using a gamepad, it takes a while to turn around. And during that time, if the game is modeling motion blur, your view of the scene is blurred.

    Try moving your eyeballs from side to side for a bit. You will get a motion-blurred scene. So that much is right.

    But the problem is that if you look to the side in real life, it’s pretty quick. You can maybe snap your eyes there, or maybe do a head turn plus an eye movement. It doesn’t take a long time for your eyes to reach their destination.

    So you aren’t getting motion blur of the whole surrounding environment for long.

    That is, humans have eyes that can turn rapidly and independently of our heads to track things, and heads that can turn independently of our torsos. So we often can keep our eyes facing in one direction or snap to another direction, and so we have limited periods of motion blur.

    Then on top of that, many first-person shooters and other games have a crosshair centered on the view, so aiming involves moving the view too. The twin-stick video game character is basically an owl: eyes fixed in position relative to their head, head fixed relative to their torso (at least in terms of yaw), a gun strapped to their face, and a limited rate of turn. A real-life person built like that would probably find motion blur more prominent too, since much of the time they’d have to be moving their whole view relative to whatever they want to be looking at.

    Might be that it’d be better if you’re playing a game with a VR rig, since then you can have – given appropriate hardware – eyetracking and head tracking and aiming all separate, just like a human.

    EDIT: Plus the fact that usually monitors are a smaller FOV than human FOV, so you have to move your direction of view more for situational awareness.

    https://old.reddit.com/r/askscience/comments/gcrlhn/what_fov_do_humans_have_like_in_video_games_can/

    Human field of view is around 210 degrees horizontally. Each eye has about 150 degrees, with about 110 degrees common to the two and 40 degrees visible only to that eye.

    A typical monitor takes up a considerably smaller chunk of one’s viewing arc. My recollection is that PC FPS FOV is traditionally rendered at 90 degrees. That’s effectively a mild fisheye effect – the arc the screen actually subtends at your eye is usually lower, more like 50 degrees, if the view were to be undistorted. IIRC, the arc a TV subtends is usually even smaller, since TVs are larger but viewers sit a lot further away, so console games often render at a lower FOV. So you’re working with a relatively small window into the game world, and you need to move your view around more to maintain situational awareness; again, more movement of your direction of view. A VR rig might also help with that, I suppose, due to the wide FOV.
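    To make that ~50-degree figure concrete, here’s a minimal sketch of the geometry. The monitor width and viewing distance below are just placeholder assumptions (roughly a 27" monitor at arm’s length), not measurements from anywhere:

    ```python
    import math

    def subtended_angle_deg(screen_width, viewing_distance):
        """Horizontal arc the screen covers at the eye, in degrees.

        Half the screen width over the viewing distance gives the tangent
        of half the subtended angle.
        """
        return math.degrees(2 * math.atan((screen_width / 2) / viewing_distance))

    # Placeholder numbers: a ~27" monitor is about 0.60 m wide; arm's length is ~0.65 m.
    print(subtended_angle_deg(0.60, 0.65))  # ~49.5 degrees
    ```

    Compare that ~50-degree physical arc to the 90 degrees of game world being squeezed into it, and to the ~210-degree human field of view above.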


  • tal@lemmy.today to Ask Lemmy@lemmy.world · People who leave motion blur on in games, why?

    Motion blur is a win if it’s done correctly. Your visual system can use that blur to determine the movement of objects, and in fact expects it. Move your hand quickly in front of your eyes – your fingers are a blur.

    If you’ve ever seen something filmed at a high frame rate and then played back at a low frame rate without any sort of interpolation, it looks pretty bad. Crystal-clear stills, but jerky.

    One brute-force approximation is to keep ramping the frame rate higher and higher.

    But that’s computationally expensive, and your visual system can’t actually process 1000 Hz or whatever – what it’s getting is just a blur of multiple frames.
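    That “blur of multiple frames” is basically what a brute-force accumulation renderer does explicitly. A minimal sketch, assuming a hypothetical `render_at(t)` callback that returns a frame at time `t` (not any particular engine’s API):

    ```python
    import numpy as np

    def accumulation_motion_blur(render_at, t, shutter, samples=8):
        """Brute-force motion blur: render several instants across the
        shutter interval and average them, rather than relying on the eye
        to blend a very high frame rate on its own."""
        times = np.linspace(t, t + shutter, samples)
        frames = [render_at(ti) for ti in times]  # each an HxWx3 float array
        return np.mean(frames, axis=0)            # the blended, blurred frame
    ```

    Shipping games usually approximate that average far more cheaply, e.g. by smearing a single rendered frame along a per-pixel velocity buffer, which is the kind of more-efficient approach mentioned below.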

    It’s theoretically possible to have motion blur approaches that are more-efficient than fully rendering each frame, slapping it on a monitor, and letting your eye “blur” it. That being said, I haven’t been very impressed by what I’ve seen so far in games. But if done correctly, yeah, you’d want it.

    EDIT: A good example of specialized motion blur that’s been around forever in video games is the arc behind a swinging sword. It gives the sense of motion without having to render a bazillion frames to get that nice, smooth arc.
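    For what it’s worth, those sword arcs are typically done as a “trail”: keep a short history of the blade’s endpoints and draw fading geometry between successive samples. A rough sketch of just the bookkeeping (the actual draw call is engine-specific and omitted; all names here are made up for illustration):

    ```python
    from collections import deque

    class SwordTrail:
        """Remembers the last few blade positions and fades them out,
        which is what produces the smooth arc without rendering a
        bazillion extra frames."""

        def __init__(self, max_segments=16):
            self.history = deque(maxlen=max_segments)  # (base, tip) pairs, newest last

        def record(self, base, tip):
            """Call once per frame with the blade's current endpoints."""
            self.history.append((base, tip))

        def segments(self):
            """Yield (base, tip, alpha); older samples are more transparent.
            A renderer would stitch consecutive samples into translucent quads."""
            n = len(self.history)
            for i, (base, tip) in enumerate(self.history):
                yield base, tip, (i + 1) / n  # oldest nearly transparent, newest opaque
    ```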


  • Plus, even if you manage to never, ever have a drive fail, accidentally delete something that you wanted to keep, inadvertently screw up a filesystem, crash into a corruption bug, have malware destroy stuff, make an error writing a script that causes it to wipe data, realize too late that an old version of something you overwrote was still something you wanted, or run into any of the other ways in which you could lose data…

    You gain the peace of mind of knowing that your data isn’t a single point of failure away from being gone. I remember some pucker-inducing moments from before I ran backups. Even aside from the occasions where backups saved me from losing data, I could sleep a lot more comfortably all the other times too.


  • tal@lemmy.today to Ask Lemmy@lemmy.world · Generating art with the same style - Help

    You’re probably better off asking on !imageai@sh.itjust.works.

    If you want to try there, I can throw some ideas out.

    Asklemmy, despite the oft-confusing community name that generates a lot of these sorts of questions, isn’t really intended as a general “ask any question” community, but rather for “thought-provoking” questions. Their Rule 5 excludes stuff like this.

    The mods tend to delete stuff like this; I’ve had a few questions that I’ve spent time answering and then had the post deleted with the answers, which is kinda frustrating if you’ve put effort into an answer.

    If you ask there, I’d suggest indicating which system you used to generate the image, as it’ll affect the answer.


  • Well, for me, the selling points are:

    • Versus earlier versions of USB, it’s reversible. This isn’t a game-changer, I guess, but it’s definitely nice not to have to fiddle around with plug orientation all the time.

    • I don’t know if it’s the only form of USB that does USB PD – I’d guess not – but in practice, it seems to be pretty strongly associated with USB PD. Having USB PD isn’t essential, but it makes charging larger devices, like laptops, a lot more practical. I can lug around a power station that doesn’t need to have an embedded inverter.

    I still feel that it’s kind of physically small and weak compared to USB A. That’s an okay tradeoff for small portable devices that don’t have the space for larger connectors, but I’m kinda not enthralled about it on desktop. I worry more about bending connectors (and I have bent them before).

    So for me, I’d say that it’s definitely nice, but not really in a game changing sense. I could do the things it can do in somewhat-worse ways prior to USB-C.