• 0 Posts
  • 182 Comments
Joined 1 year ago
Cake day: June 4th, 2023

  • We also submitted our very first game in this jam and used Godot.

    At no point did the engine get in our way; it just let us do what we wanted.

    The only issue we encountered is that the shader cache does not seem to work: every time a level with the simplest shader in the world loads, we have to wait 5 seconds, which is really annoying when debugging. From what I read, Godot is supposed to be faster on the second launch? Not sure what went wrong there.

    We also could not export for web because we used C#, but it appears that feature is coming soon™.



  • Domi@lemmy.secnd.me to Linux Gaming@lemmy.ml · Just Switch Over · 1 month ago

    “I’ll stick to Windows. I don’t want to deal with those people.”

    That’s a strange conclusion to come to; installing an OS doesn’t come with an obligation to deal with anyone.

    I like to play games on Steam but that doesn’t mean I have to deal with the atrocity that is the Steam forums.




  • Do you use C# since you’re coming from Unity?

    You can use GetNode<CustomNode>() or GetChild<CustomNode>() to find the node you need, just like in Unity. CustomNode is the type of your script; if there is no script attached to the node, you can use the builtin types instead (e.g. Node3D).

    Once you have the node you want, you can either use Godot’s builtin functions SetMeta() and GetMeta() to set and get metadata on a node, or do it the C# way and access the public properties of that CustomComponent class directly.
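    A minimal sketch of both approaches in Godot 4 C# (HealthComponent, MaxHealth, and the child node name “Health” are made-up names for the example):

    ```csharp
    using Godot;

    // Hypothetical script attached to a child node named "Health".
    public partial class HealthComponent : Node
    {
        public int MaxHealth { get; set; } = 50;
    }

    public partial class Player : Node3D
    {
        public override void _Ready()
        {
            // Typed lookup by path, similar to Unity's GetComponent pattern.
            var health = GetNode<HealthComponent>("Health");

            // Option 1: Godot's metadata API, available from any language.
            health.SetMeta("team", "blue");
            GD.Print(health.GetMeta("team").AsString());

            // Option 2: plain C#, use the public members directly.
            health.MaxHealth = 100;
        }
    }
    ```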

    I don’t use GDScript but I assume you have the same methods available there.


  • “So what’s the big fuggin’ problem here? That Intel won’t use the term ‘recall’?”

    Would you say the same thing about a car?

    “We know the door might fall off but it has not fallen off yet so we are good.”

    The chances of that door hurting someone are low and yet we still replace all of them because it’s the right thing to do.

    These processors might fail any minute and you have no way of knowing. There are people who depend on them for work, and systems running essential services. Even worse, they might fail silently and corrupt something in the process, or cause unnecessary debugging effort.

    If I were running those processors in a company, I would expect Intel to replace every single one of them at their own cost, before they fail or show signs of failing.

    Those things are supposed to be reliable, not a liability.




  • Domi@lemmy.secnd.me to Linux Gaming@lemmy.world · HDR Confusion · 2 months ago

    “But why does it end up washing out colors unless I amplify them in kwin? Is just the brightness absolute in nits, but not the color?”

    The desktop runs in SDR, and the color space differs between SDR and HDR, so you end up with washed-out colors when you display SDR content in HDR mode as-is.

    When you increase the slider in KDE, you change the tone mapping, but no tone mapping is perfect, so you might want to leave it at the default 0% and use HDR mode only for HDR content. In KDE, for example, colors are blown out when you put the color intensity at 100%.

    “Why does my screen block the brightness control in HDR mode but not contrast? And why does the contrast increase the brightness of highlights, instead of just split midtones towards brighter and darker shades?”

    In SDR, your display is not sent absolute values, meaning you can pick what 100% means; that’s your usual brightness slider.

    In HDR, your display is sent absolute values. If the content you’re displaying requests a pixel at 1000 nits, your display should output exactly 1000 nits, if it can.

    Not sure about the contrast slider, I never really use it.
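    To make “absolute values” concrete: HDR10 encodes pixels with the SMPTE ST 2084 (PQ) transfer function, which maps the signal to absolute luminance up to 10,000 nits. A small C# sketch of the nits-to-signal direction, with the constants from the spec (illustrative only, not how KWin implements it):

    ```csharp
    using System;

    class PqDemo
    {
        // SMPTE ST 2084 (PQ) inverse-EOTF constants.
        const double M1 = 2610.0 / 16384.0;        // 0.1593017578125
        const double M2 = 2523.0 / 4096.0 * 128.0; // 78.84375
        const double C1 = 3424.0 / 4096.0;         // 0.8359375
        const double C2 = 2413.0 / 4096.0 * 32.0;  // 18.8515625
        const double C3 = 2392.0 / 4096.0 * 32.0;  // 18.6875

        // Encode an absolute luminance in nits into a PQ signal value in [0, 1].
        static double NitsToPq(double nits)
        {
            double y = Math.Clamp(nits / 10000.0, 0.0, 1.0); // PQ tops out at 10,000 nits
            double yM1 = Math.Pow(y, M1);
            return Math.Pow((C1 + C2 * yM1) / (1.0 + C3 * yM1), M2);
        }

        static void Main()
        {
            // The signal names the exact luminance the display should output:
            Console.WriteLine($"100 nits  -> PQ {NitsToPq(100):F3}");  // ~0.508
            Console.WriteLine($"1000 nits -> PQ {NitsToPq(1000):F3}"); // ~0.751
        }
    }
    ```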

    “Why is truehdr400 supposed to be better in dark rooms than peak1000 mode?”

    Because 1000 nits is absurdly bright, almost painful to watch in the dark. I still usually use the 1000 mode and turn on a light in the room to compensate.

    “Why is my average emission capped at 270 nits? That seems ridiculously low, even compared to normal SDR screens.”

    Display technology limitations. OLED screens can only hold their full brightness over a limited portion of the screen (e.g. 10% of the area for 400 nits, 1% for 1000 nits) before having to dim. That makes HDR mode mostly unusable for desktop use, since the screen dims and brightens as you move large white or black areas around.

    OLED screens simply can’t deliver the brightness of other display technologies, but their benefits easily make up for it.