This genie must’ve read or watched Brewster’s Millions.
I saw it at the cinema and vaguely remember enjoying it well enough. It’s not a great movie, but it’s not awful, either. I didn’t know that it was supposed to be terrible; it looks like reviewers gave it a slightly better than average score.
I don’t expect ever to watch it a second time, if that helps.
Lara Croft Tomb Raider: The Cradle of Life, though… All I can remember about it now is that afterwards, my friends and I agreed that we should’ve trusted our instincts and just walked out after about 30 minutes.
That’s still newer than any of my daily-use laptops, which are all running full-featured Linux distros just fine. I got 'em all cheap secondhand, then pumped up the RAM (to 12-16 GB) and installed SSDs.
Did you read all the way to the end of the article? I did.
At the very bottom of the piece, I found that the author had already expressed what I wanted to say quite well:
In my humble opinion, here’s the key takeaway: just write your own fucking constructors! You see all that nonsense? Almost completely avoidable if you had just written your own fucking constructors. Don’t let the compiler figure it out for you. You’re the one in control here.
The joke here isn’t C++. The joke is people who expect C++ to be as warm, fuzzy, and forgiving as JavaScript.
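To make that concrete, here’s a minimal sketch of what “just write your own fucking constructors” might look like in practice (the Widget class and its members are made up for illustration, not from the article):

```cpp
#include <string>
#include <utility>

class Widget {
public:
    // Explicit default constructor: members get known values,
    // instead of whatever a compiler-generated default would leave.
    Widget() : id_(0), name_("unnamed") {}

    // Explicit value constructor: the one sanctioned way to set the members.
    Widget(int id, std::string name) : id_(id), name_(std::move(name)) {}

    // Spell out copy/move behaviour instead of letting the compiler
    // decide which special members to generate and when.
    Widget(const Widget&) = default;
    Widget(Widget&&) noexcept = default;

private:
    int id_;
    std::string name_;
};

int main() {
    Widget a;               // explicit default ctor: known state
    Widget b(42, "answer"); // explicit value ctor
    Widget c = b;           // copy, declared above, no surprises
}
```

A few extra lines up front, and none of the “which constructor did the compiler synthesize for me?” guessing games from the article.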
Yeah, I’m sure that almost all of us have felt this way at one time or another. But the thing is, every team behind every moronic, bone-headed interface “update” that you’ve ever hated also sees themselves in the programmer’s position in this meme.
Since you seem earnest, probably play_my_game or possibly gamedev.
I reserve further comment until I know whether you posted this in this community: a) deliberately and seriously, b) deliberately and sarcastically, or c) by accident.
Any time I need to learn something about JS, I go to W3Schools to wrap my head around the basics, then over to MDN for current best practice.
That was my first take as well, coming back to C++ in recent years after a long hiatus. But once I really got into it I realized that those pointer types still exist (conceptually) in C, but they’re undeclared and mostly unmanaged by the compiler. The little bit of automagic management that does happen is hidden from the programmer.
I feel like most of the complex overhead in modern C++ is actually just explaining in extra detail about what you think is happening. Where a C compiler would make your code work in any way possible, which may or may not be what you intended, a C++ compiler will kick out errors and let you know where you got it wrong. I think it may be a bit like JavaScript vs TypeScript: the issues were always there, we just introduced mechanisms to point them out.
You’re also mostly free to use those C-style pointers in C++. It’s just generally considered bad practice.
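For anyone who wants the contrast spelled out, here’s a tiny sketch (Buffer and the function names are invented): both versions compile as C++, but only the second declares the ownership that the C-style version leaves implicit.

```cpp
#include <memory>

struct Buffer { int data[64]; };

// C-style: a raw pointer says nothing about who frees it.
Buffer* make_buffer_c_style() {
    return new Buffer{};  // caller must remember to delete
}

// C++-style: unique_ptr states the ownership the C version left implicit.
std::unique_ptr<Buffer> make_buffer_cpp_style() {
    return std::make_unique<Buffer>();  // freed automatically when dropped
}

int main() {
    Buffer* raw = make_buffer_c_style();
    delete raw;  // easy to forget, and the compiler won't complain if you do

    auto owned = make_buffer_cpp_style();
    // no cleanup needed; the destructor handles it
}
```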
Every time I see yet another obscure game/platform article or video, I realise that I’ve once again forgotten how little most people delve into the history of their creative media. I’m teaching myself about Soviet clones and niche Japanese systems that came out before I was born, while some 20-something self-proclaimed video game historian is releasing a video titled “The most obscure game that NO-ONE remembers” about Legacy of Kain or Space Quest or Sly Cooper or some other million-selling franchise that just hasn’t had a new release in the last 5-10 years.
I’m waiting for these guys to get old enough to start seeing “world’s most obscure game” videos about Minecraft and Fortnite.
AIX is pretty obscure as a gaming platform, though, I’ll give you that.
As someone who has often been asked for help or advice by other programmers, I know with 100% certainty that I went to university and worked professionally with people who did this, for real.
“Hey, can you take a look at my code and help me find this bug?”
(Finding a chunk of code that has a sudden style-shift) “What is this section doing?”
“Oh that’s doing XYZ.”
“How does it work?”
“It calculates XYZ and (does whatever with the result).”
(Continuing to read and seeing that it actually doesn’t appear to do that) “Yes, but how is it calculating XYZ?”
“I’m not 100% sure. I found it in the textbook/this ‘teach yourself’ book/on the PQR website.”
Most people use the term “Hungarian Notation” to mean only adding an indicator of type to a variable or function name. While that is one way it has been used (and it actually made sense in certain old environments, although those days are long, long behind us now), it’s not the only way the concept can be applied.
We can use the same concept (prepending or appending an indicator from a standard selection) to denote other, more useful categories that the environment won’t keep straight for us, or won’t warn us about in easy-to-understand ways. In my own projects I usually append a single letter to the ends of my variable names to indicate scope, which helps me stay more modular, and also allows me to choose sensible variable names without fear of clashing with something else I’ve forgotten about.
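To sketch what I mean (the exact letters and separator here are just an illustrative scheme, not a standard):

```cpp
#include <cstdio>

// Hypothetical convention: g = global, m = module (file) scope,
// no suffix = local. The suffix flags scope the compiler won't.

int retry_limit_g = 3;        // global: reachable from anywhere

static int cache_hits_m = 0;  // module scope: private to this file

int lookup(int key) {
    int result = key % retry_limit_g;  // locals need no suffix
    ++cache_hits_m;
    return result;
}

int main() {
    std::printf("%d\n", lookup(7));
}
```

The payoff is that a bare name like `result` is instantly safe to reuse in another function, because anything with wider scope announces itself.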
“If you were making food, would you use onion powder?”
As a half-joking response to this half-joking admission, I got started with the Usborne programming books as a kid, and they laid some excellent foundations for my later study. They’re all available online for free these days, so grab an emulator and user manual for your 80s 8-bit home computer of choice, and dive in!
Thr34dN3cr0 wrote (14:12 5/17/2019):
Does anyone have a way to fix this in the latest version? I’ve been looking all day but none of the answers I’ve found work.
Thr34dN3cr0 wrote (14:48 5/17/2019):
nvm figured it out.
“If you wish to be a writer, write.”
Epictetus delivered this burn over 1900 years ago.
Re: the Acceptance stage.
Years ago I worked at a family-run business with a good working environment. The staff were once told a story of how, earlier in the company’s history, a manager made a mistake that caused the company a substantial monetary loss.
The manager immediately offered their resignation, but the owner said to them, “Why would I let you go now? I’ve just spent all this money so you could learn a valuable lesson!”
So yeah, most managers’ knee-jerk reaction to a developer accidentally deleting vital data from production is going to be to fire them in “retaliation”, but if you think about it, the best response is to keep that developer: your data isn’t coming back either way, and that developer has just learned to be a lot more careful in the future. Why would you send them to a potential competitor?
Yes, I think that most of us realized from some of the self-aware wording that this is a parody. But like many parodies it’s a real trope taken to a silly extreme, so we’re talking about users who fit that trope (including ourselves, sometimes!).
Oh hell, you gave me a PTSD flashback!
It’s the late 90s. My mother suddenly discovers Windows Explorer on her refurbished commodity Wintel box and decides that all this messy clutter has to go. Never mind that the drive was 80% empty when delivered and I didn’t expect her to come close to filling it before it was replaced. Fortunately, I had already backed up everything that looked important or interesting.
One day she calls from the office, “I don’t need this ‘Windows’ any more, do I?”
“What? Wait! Don’t do anything!” I walk in and she’s got C:\Windows highlighted and the cursor is hovering over “Delete”.
“I already have Windows installed on this computer, so I don’t need this any more, do I?” Spoken more as a statement than a question. It took several minutes of forced calm explanation to get her to accept that this “Windows” directory WAS the Windows that’s installed on the machine. She still wasn’t happy that she could see it in Windows Explorer, though. So untidy!