Yeah, my guess is it didn’t even fully understand the prompt behind this slop either…
Another crap meme made with AI, yay, I love AI slop
The client wants to drag and drop their own personalized Excel file, with no guaranteed formatting, column order, or data contract, in order to import their data into our system <3
Needs more AI to randomly guess what the columns might be
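(If anyone actually gets handed that ticket, here’s a minimal sketch of the “guess the columns” heuristic, assuming a hypothetical set of canonical field names and plain header strings pulled from row 1. The field names, example headers, and the guessColumns function are all made up for illustration; a real importer would still need an actual Excel parser and validation on top.)

```typescript
// Hypothetical canonical fields the importer expects (not from the original post).
const CANONICAL_FIELDS = ["customer_name", "email", "order_date", "amount"] as const;
type CanonicalField = (typeof CANONICAL_FIELDS)[number];

// Normalize a header: lowercase, drop anything that isn't a letter or digit.
function normalize(header: string): string {
  return header.toLowerCase().replace(/[^a-z0-9]/g, "");
}

// Guess which spreadsheet column corresponds to which canonical field by
// checking that every token of the field name appears in the header.
// Anything that can't be matched simply stays unmapped.
function guessColumns(headers: string[]): Map<CanonicalField, number> {
  const mapping = new Map<CanonicalField, number>();
  headers.forEach((raw, index) => {
    const normalized = normalize(raw);
    for (const field of CANONICAL_FIELDS) {
      const tokens = field.split("_");
      if (!mapping.has(field) && tokens.every((t) => normalized.includes(t))) {
        mapping.set(field, index);
        break;
      }
    }
  });
  return mapping;
}

// Headers as they might appear in row 1 of the client's "personalized" file.
console.log(guessColumns(["Customer Name!", "E-Mail", "Date of order", "AMOUNT ($)"]));
```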
Do we have the same client?
Everyone has and is that client.
Yeah but when I’m like that it’s justified
Surprised there’s no one in the comments going bat shit crazy that this was made by AI. Are we not doing that anymore?
You just had to wait 2 more hours for that.
Well, the table sucks. I can’t draw and I could do better.
You know what we, in the industry, call a specification of requirements detailed enough to produce software? Code.
Just wait until QA gets a hold of it
AI slop image, for this gag?
How queer, I must report to my supervisor posthaste!
I had a client once explain to me that his request for the 75% redesign of his mobile app would be simple because “it’s just 3 pages”
That was the exact quote
I know that was hardly related to the post, but it reminded me of that and I needed to vent to my therapist (aka strangers on Lemmy)
I feel you. Just finished coding “a little special case” that resulted in dozens of files changed, all because I refused to do it with a dirty hidden hack, and that was even a clear-cut technical branch, no vague ideas involved.
Talking to a client is many times that much of a hurdle.
Managers about to find out the hard way that all the requirements are in the brains of those they laid off.
I’m sure coding bootcamp and AI will turn them into leet hax0rs.
Definitely happening at at least one major company I’m familiar with.
Requirements and everything else.
I feel this in my bones. Even before the recent round of restructuring we’ve had a significant amount of turnover. Our infrastructure is a massive Rube Goldberg machine with multiple houses of cards built on top of it. Institutional knowledge was never written down, and it has been leaving the company at an accelerating rate over the past 5 years. Tons of “new blood” making lots of assumptions about how things work is leading to… humorous end results.
Whoa whoa, hold on there! You can’t expect a product manager to come up with such detailed specs!
I am a product manager who loves coming up with detailed specs. How else will I actually get what I want? If you care about some specific behavior or outcome, you must specify it. This logic is lost on my leadership.
AI Project Manager: Create a button on a webpage that, when clicked, displays an alert saying “Hello World!”
AI Programmer: “What a sensible requirement! Here you go.”
AI Billing Department: “Project completed, that’ll be 10 million dollars.”
Client AI Payments Department: “Sounds right, paid!”
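(For the record, here’s roughly what that ten-million-dollar deliverable amounts to: a minimal TypeScript sketch of the requirement as literally stated, assuming it runs in a browser page that already has a body element.)

```typescript
// The entire ten-million-dollar deliverable, as literally specified:
// a button that, when clicked, shows an alert saying "Hello World!".
const button = document.createElement("button");
button.textContent = "Click me";
button.addEventListener("click", () => {
  alert("Hello World!");
});
document.body.appendChild(button);
```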
AI can’t replace programmers right now, but I’ve said all through my software dev career that our ultimate goal is to eliminate our jobs. Software will eventually be able to understand human language and think of all the right questions to ask to turn “Customer wants a button that does something” into an actual spec that generates fully usable code. It’s just a matter of time. Mocking AI based on what it currently can’t do is like mocking airplanes because of what they couldn’t do in the 1920s.
I had a number of points to discuss, but they pale before this:
Software will eventually be able to understand human language
First, someone surely must have tried to code it, but I have never heard of any system like that. Second, and more important: does anyone understand how we understand? And how the gap between understanding and communicating is bridged? Someone? Anyone?
And before some smart person tries, for the thousandth time, the “but computers will get bettah” kind of argument: even setting aside the whole task of putting it into code, we know jack shit about how we think, understand, and speak, and that’s coming from someone with a Master’s degree in linguistics.
Yes, the main problem with developing AI is that we really don’t understand how we think. Current AI doesn’t understand anything; it just imitates human output by processing a vast amount of existing output. But we do know a lot more now about how we think, understand and speak than we did a hundred years ago, and as a linguist you know this work isn’t standing still. Compare it with genetics - 70 years ago we didn’t even know the structure of DNA, and now we can splice genes. The fact that there’s still a lot of baseline work to do shouldn’t cast doubt on the goal, should it?
Oh yes it should. We have spent thousands of years looking at these things, and look where we are
For almost all of those thousands of years, no tools existed to analyze the actual mechanics of brain function. The development of all sciences has been exponential over the last couple of centuries. I’ll be here if you decide you want to converse like someone with a master’s degree instead of a mediocre high school student scrolling Lemmy on the toilet.
Lol. Good luck, mister exponential science
I just want to point out that this whole AI thing started with people not understanding how it works. I get your point, although I think stumbling into progress, rather than understanding, will be the method until we do.
How about… AI replaces government officials! A lot cheaper. Might actually get things right. And how could it fuck up any worse than what we have?
“Government bad durr durr”
Good, I hope so. Whatever puts an end to the hipster, activist dev is a solid win.