• 0 Posts
  • 108 Comments
Joined 1 year ago
Cake day: June 20th, 2023

  • I came here to say the same.

    People in the technical career track spend most of their time making software, one way or another (though there comes a point where you’re doing more preparation to code than actual coding).

    As soon as you jump into the management career track, it’s mostly meetings to report the team’s progress to upper management, even if you’re supposedly “technically oriented”.

    Absolutely. As you become a more senior tech, things become more and more about figuring out what needs to be done at higher and higher levels (i.e. systems design, software development process design), which means interacting with more and more stakeholders (your whole team, other teams, end users, management) and hence more meetings. But you still get to do lots of coding, or at least code-adjacent work (i.e. design).


  • Aceticon@lemmy.world to Programmer Humor@programming.dev · Any Volunteers · edited 2 months ago

    The inability to detail the idea all the way down to the level where something concrete can be made from it kills it well before any lack of coding skills does.

    It’s like what separates having an idea for a book from writing an actual book that is enjoyable to read: there is no “knowing how to code” barrier there, and yet most people can’t actually pull it off when they try, or it ends up shallow and uninteresting.


  • Don’t take this badly, but it sounds like you’ve only seen a tiny slice of the software development done out there and have had some really bad experiences with Agile in it.

    It’s perfectly understandable: there are probably more bad uses of Agile out there than good ones, and certain areas of software development tend to be dominated by environments which are big bloody “amateur hour every hour of the day, every day of the year” messes, Agile or no Agile.

    That does not, however, mean that your experience stands for the entirety of what’s out there, trumping even the experience of other people who also work in QA in environments where Agile is used.


  • Aceticon@lemmy.world to Programmer Humor@lemmy.ml · Users · edited 2 months ago

    Agile was definitely taken up with the same irrationality as a fashion trend at some point.

    It’s probably the best software development process philosophy for certain environments (for example: where there are fast-changing requirements and easy access to end users) whilst being pretty shit for others (good luck trying to fit it in at a process level when some of the software development is outsourced to independent teams, or trying to use it for high-performance systems design). It eventually came out of that fad period being used more for the right things (even if, often, less than properly) and less for the wrong things.

    That said, the Agile-as-fad phase was over a decade ago.


  • Aceticon@lemmy.world to Programmer Humor@lemmy.ml · Users · edited 2 months ago

    Agile made Management, who until then had actual Senior Designer-Developers and Technical Architects designing and adjusting the development processes, think that they had a silver-bullet software development recipe that worked for everything, so they didn’t need those more senior people (read: more expensive and unwilling to accept the same level of exploitation as the more junior types) anymore.

    It also drove the part of the Tech Industry that relies mainly on young and inexperienced techies and management (*cough* Startups *cough*) to think they didn’t need experienced techies.

    As usual, it turned out that “there are no silver bullets” and things are more complex: Agile doesn’t work well for everything, and various of its individual practices only make sense in some cases (in some they’re even required for the rest to work properly) whilst in others they’re a massive waste of time (and sometimes the useful-wasteful balance depends on frequency and timing); plus in some situations (outsourced development) they’re extremely hard or even impossible to pull off at project scope.

    That said, I bet that what you think of as “The Industry” is mainly Tech companies in the US, rather than where most software development occurs: large non-Tech companies with a high dependency on software for competitive advantage - such as Banks - and hence with more than enough specific software requirements to justify vast software development departments that develop custom solutions in-house for their specific needs.

    Big companies whose success depends on their core business-side employees doing their work properly care a lot more about software not breaking or delaying their business processes (and hence hire QA to catch those problems in new software before it even gets to the business users) than Tech companies whose retail users are non-paying and aren’t even their customers (the customers are the advertisers being sold access to those users’ eyeballs), and who will hence shovel just about anything out the door and hopefully sort out the bugs and lousy UX/UI design through A/B testing and user bug reports.


  • Aceticon@lemmy.world to Programmer Humor@lemmy.ml · Users · edited 2 months ago

    Yeah.

    Any good software developer is going to account for and even test all the weird situations they can think of … but not the ones they cannot think of, as they’re not even aware of those as a possibility (if they were, they would account for and test them).

    Which is why you want somebody with a different mindset to independently come up with their own situations.

    It’s not a value judgment on the quality of the developer; it’s just accounting, at the software development process level, for the fact that humans are not all-knowing, not even devs ;)
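
    A minimal sketch of that idea in Python (a hypothetical function and tests, not anything from this thread; the hypothesis library plays the part of that “different mindset” by generating inputs nobody enumerated by hand):

        from hypothesis import given, strategies as st

        def normalize_discount(percent: float) -> float:
            """Clamp a discount percentage into the range [0, 100]."""
            if percent < 0:
                return 0.0
            if percent > 100:
                return 100.0
            return percent

        # The weird situations the developer thought of: these all pass.
        def test_known_weird_cases():
            assert normalize_discount(-5.0) == 0.0
            assert normalize_discount(250.0) == 100.0
            assert normalize_discount(42.0) == 42.0

        # The situation nobody thought of: hypothesis also generates NaN,
        # which compares False against everything, so both guards miss it
        # and it comes back out unclamped -- this test fails.
        @given(st.floats())
        def test_result_always_in_range(percent):
            assert 0.0 <= normalize_discount(percent) <= 100.0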


  • Aceticon@lemmy.world to Programmer Humor@lemmy.ml · Users · edited 2 months ago

    “Wrong way” for whom?

    In Software Development it ultimately boils down to: “are you making software for the end users, or are you making it for yourself?”

    Because in your example, that’s what ultimately defines whose “wrong” the developer is supposed to guide him/herself by.

    (So yeah, making software for fun or for your own personal use is going to follow quite different requirement criteria than making software for use by other people.)



  • I’ve actually worked with a genuine UX/UI designer (not a mere Graphics Designer, but that field’s version of a Senior Developer-Designer/Technical-Architect).

    Let’s just say most developers aren’t at all good at user interface design.

    I would even go as far as saying most Graphics Designers aren’t all that good at user interface design.

    Certainly that explains a lot of the shit user interface design out there, same as the “quality” of most common Frameworks and Libraries (such as those from the likes of Google) can be explained by them not actually having people with real-world Technical Architect or even Senior Designer-Developer experience overseeing the design of Frameworks and Libraries for 3rd-party use.



  • The good ones: design and adjust software development processes, set standards for cross-project functionality and reusability, and in general try to improve, at a high level, the process of making, maintaining and improving software in a company.

    The bad ones: junior/mid-level software design with a thick layer of bullshit on top to spin it as advanced stuff.

    If you want to see bad software architecture, just look at most of Google’s frameworks and libraries.