• 0 Posts
  • 18 Comments
Joined 1 year ago
Cake day: January 11th, 2024

  • nowhere in the history of language has “there should be such a thing” meant or even implied “making such a thing is easy”

    I know it’s hyperbole, but you can’t possibly back that statement up.

    if anything it implies the opposite.

    It doesn’t, but I agree it didn’t really imply the difficulty was high either.

    I wasn’t saying the reply was correct, I was stating the intended meaning (at least as I see it).


    To answer your original post: design platforms with version control exist.

    Some use git under the hood, some don’t, and most don’t require you to understand git to use them.

    Hopefully that saves you some time, as you now don’t have to build the platform from scratch.



  • So, benefit of the doubt time.

    That’s some mental gymnastics in there but let’s see if we can get it.

    So the reply isn’t actually suggesting you create the platform for designers; they’re pointing out that there is a lot more to competent platform/software design than it seems, so try it yourself and find out.

    If it turns out you do in fact have the answers, great, we now have the platform you described.

    Chances are you’ll find out just how difficult it is to do what you are suggesting and realise that implying someone could “just” create a platform for designers isn’t particularly realistic.



  • I consider this specific example to also be an issue of language, which is itself a construct.

    Murder as a word has meaning based in law, which is another construct.

    If you were to switch out “murder” for “killing” the outcome remains the same (cessation of life by another party) but the ethical and moral connotations are different.

    Some people use murder when they mean killing, and vice versa, which adds a layer of complexity and confusion.

    Though all of that could just be me venturing into pedant country.



  • The examples fit irony, I suppose, but that’s a very broad assumption of nationality for it to apply to the comment you are replying to.

    There could be people who are not American who also disagree with your approach.

    Regardless, question answered, thank you.




  • Sure, when you reach a point where you don’t have better options to achieve the desired goal (for whatever metric you define as ‘better’), then killing is on the table, by the sounds of it.

    All we need now is an agreement on the threshold.

    I’m assuming you’ll concede that individual killing comes before mass killing in the hierarchy of options.

    So, once this threshold is reached, then, according to your logic, you are morally allowed to kill in defence (and I assume pre-emptive defence, given the “They are won by stopping the enemy‘s ability to act.” statement).

    So, going back to your original statement, it’s entirely possible to kill an individual and still ‘believe in universal human rights’, by your definition?

    Provided the correct conditions are met, ofc.



    Rights need to be balanced against each other in practice, of course.

    So contradiction is possible, as I have said, and balance would require contextual interpretation in practice.

    Absolute statements such as:

    Once a war has started, killing is morally acceptable, not before.

    and

    You don’t kill people for their ideological beliefs, but to stop their ability to act and remove them from power.

    can be contradictory, depending on context.

    I wasn’t challenging your interpretation (though I do think it’s naive and idealistic to the point of impracticality); I was pointing out that your statements could be considered contradictory.

    While I’m at it, I missed a false dichotomy as well:

    Wars aren’t won by killing soldiers. They are won by stopping the enemy‘s ability to act.

    Those things are not mutually exclusive.

    You can find that in international humanitarian law.

    That’s a large amount of text to sift through; if you could give me a hint as to where it specifies moral authority before and after an official declaration of war, I’d appreciate it.






  • Depends on the team.

    On paper, what you’re “supposed” to do is iterate through gameplay mechanics and scenarios by building up the bare minimum needed to get a feel for them, then once you have something viable you proceed further along the development process.

    In reality it depends heavily on context: sometimes you find a particular scenario works fine standalone but not as part of the whole, or some needed balancing change elsewhere breaks the fun of something established; late additions can also cause this.

    But again, that depends heavily on the type of game; RPGs are more sensitive to balancing changes than racing sims, for example.

    Specifically, we’d usually evaluate the tradeoff between how badly it doesn’t work and how much work it is to “fix” it. Sometimes it’d get cut completely, sometimes it’d get scaled back, sometimes we’d re-evaluate the feature/scenario for viability and make a decision after that re-evaluation, and sometimes we’d just bite the bullet and work through it.

    Over time you get a bit more cautious about committing to things without thinking through the potential consequences, but sometimes it just isn’t possible to see the future.

    I understand the realities of managing a project like that; at the same time, these kinds of things are known upfront to a degree, and yet people always seem surprised that the cone of uncertainty on a project like that is huge.

    As I said, I have no problem with re-use; I have a problem with saying re-use is “essential” to stopping crunch, as if the management of a project like that isn’t the core of the problem.


  • Apologies for the delay; my instance is having problems with communities, so I can’t reply with that account.

    To answer the question, not anymore.

    The crunch culture was a big part of me leaving.

    Honestly, it’s not that different in kind from non-game dev houses; the difference is in the magnitude.

    I understand why these things happen, the reasons just aren’t good enough for me.

    Poor planning compounds with ridiculous timeframes to create an almost immutable deadline to deliver unrealistic goals.

    The problem is, they’ll jump right back into the next project and make exactly the same mistakes. At what point does it stop being mistakes and start being “just how things are done”?

    One of the main reasons this works at all is that they take young, idealistic programmers who want to work in their dream industry and throw them into a cult of crunch, where “everyone is doing it, so it must be ok” or “this is the price of having my dream job.”

    It’s certainly not all studios, and it seems to have gotten marginally better at indie to small-to-medium houses, but it’s prevalent enough that it’s still being talked about.