tbf all good programmers are good at math. Not classic arithmetic necessarily, but at the very least applied calculus. It’s a crime how many people use a mathematical discipline every day but don’t think they’re “good at math” because of how laser-focused the world is on algebra, geometry, and trig as being all that “math” is.
Serious question: how does calculus apply to programming? I’ve never understood.
PID control is the classic example, but at a far enough abstraction any looping algorithm can be argued to be an implementation of the concepts underpinning calculus. If you’re ever doing any statistical analysis or anything in game design having to do with motion, those are both calculus too. Data science is pure calculus, ground up and injected into your eyeballs, and any string manipulation or regex is going to be built on lambda calculus (though a very correct argument can be made that literally all computer science is built on lambda calculus, so that might be cheating to include it).
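For a concrete taste, a minimal discrete PID loop might look like this (a rough sketch; the gains are hypothetical, and a real controller would also want setpoint handling and anti-windup):

```python
# Minimal discrete PID controller sketch. The integral term is a running sum
# (numerical integration of the error); the derivative term is a finite
# difference (numerical differentiation). Gains kp/ki/kd are made up.
class PID:
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error: float, dt: float) -> float:
        self.integral += error * dt                  # ~ integral of e(t) dt
        derivative = (error - self.prev_error) / dt  # ~ de/dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```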
Does it apply to interpolation for animation and motion?
Motion, yes, but I have no idea about the mathematics of animation (sorry)
Graphics programming is the most obvious one and it uses it plenty, but really any application that can be modeled as a series of discrete changes will most likely be using calculus.
Time series data is the most common form of this, where derivatives are the rate of change from one time step to the next and integrals are summing the changes across a range of time.
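A toy sketch of both, with made-up readings:

```python
# Hypothetical sensor readings, one per time step.
readings = [20.0, 20.5, 21.5, 21.0, 22.0]

# Discrete "derivative": rate of change from one step to the next.
deltas = [b - a for a, b in zip(readings, readings[1:])]  # [0.5, 1.0, -0.5, 1.0]

# Discrete "integral": summing the changes across the range. Adding back the
# initial value recovers the final reading (the fundamental theorem, in miniature).
assert readings[0] + sum(deltas) == readings[-1]
```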
But it can even be more abstract than that. For example, there’s a recent-ish paper on applying signal processing techniques (which use calculus themselves, btw) to databases for the purposes of achieving efficient incremental view maintenance: https://arxiv.org/abs/2203.16684
The idea is that a database is a sequence of transactions that apply a set of changes to said database. Integrating gets you the current state of the database by applying all of the changes.
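A rough sketch of that idea with a made-up key/value store, where each transaction is a set of deltas and folding over the log is the "integration":

```python
from functools import reduce

# Hypothetical transaction log: each transaction maps a key to a delta.
txs = [{"alice": +100}, {"bob": +50}, {"alice": -30}]

def apply_tx(state: dict, tx: dict) -> dict:
    new = dict(state)
    for key, delta in tx.items():
        new[key] = new.get(key, 0) + delta
    return new

# "Integrating" the whole log from an empty state yields the current database.
state = reduce(apply_tx, txs, {})  # {"alice": 70, "bob": 50}
```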
Computers are just big calculators, so to program them you need calculus.
Lotta infinite sums in loops
Good physics/graphics engines require calculus
How?
Again, legit question.
If you write them yourself, then you actually need a bit of math.
But claiming that you need math skills as a programmer because some kinds of programs need you to know maths is like claiming every programmer needs to know a lot about logistics because some people write software for warehouses.
A senior firmware engineer said to the group that we just have to integrate the acceleration of an IMU to get velocity. I said “plus a constant.” I was fired for it.
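For the record, a tiny sketch of why the constant matters (all sample values made up): integrating acceleration only gives you the change in velocity, and the initial velocity is the constant of integration.

```python
# Rectangle-rule integration of accelerometer samples (hypothetical values).
accel = [0.0, 1.0, 1.0, 0.5]   # m/s^2, one sample per time step
dt = 0.01                      # seconds between samples
v0 = 2.0                       # the "plus a constant": initial velocity,
                               # which integrating acceleration can't give you

velocity = v0
for a in accel:
    velocity += a * dt         # v = v0 + integral of a dt
```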

I’m something of a scientist myself
Had a graduate dev who did not have a fucking clue about anything computer related. How tf he passed his degree I have no idea.
Basic programming principles? No clue. Data structures? Nope.
We were once having a discussion about the limitations of transistors and dude’s like “what’s a transistor?” ~_~#
Tbh, as a dev, knowledge of transistors is about as essential as knowledge of screws is for a car driver.
It’s common knowledge and in general maybe a little shameful to not know, but it’s really not in any way relevant for the task at hand.
Maybe for dev knowledge, but computer science? The science of computers?
What kind of CS degree did you get where you learned about electrical circuits? The closest I’ve gotten to hardware is logic circuit diagrams and Verilog.
I mean, I graduated over 20 years ago now, but I had to take a number of EE courses for my CS major. Guess that isn’t a thing now, or in a lot of places? Just assumed some level of EE knowledge was required for a CS degree this whole time.
In my uni they kinda just teach Java. There is one mandatory class that’s in C and one that’s in MIPS assembly, tho.
Everyone used AI when I took those classes. By the end of the year they were still having trouble in the group chat with syntax stuff.
In my own uni’s coursework the closest we get are some labs where students breadboard some simple adder circuits, which we do just to save them from embarrassing gaps in their knowledge (like happened in the initial comment). It doesn’t add much beyond a slightly better understanding of how things can be implemented, if we’re being honest.
I learned about transistors in informatics class in high school. Everything from the bottom up: from the material that makes a transistor possible, to basic logic circuits (SR flip-flops, AND, OR, XOR, addition), to the von Neumann architecture, a basic microprocessor, and machine code and assembly.
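The addition step really is tiny once you have gates; a quick Python sketch of a half adder, assuming 0/1 inputs:

```python
# Half adder from two gates: XOR gives the sum bit, AND gives the carry bit.
def half_adder(a: int, b: int) -> tuple[int, int]:
    return a ^ b, a & b

# half_adder(1, 1) == (0, 1), i.e. 1 + 1 = binary 10.
```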
I don’t have a degree
Well, computer science is not the science of computers, is it? It’s about using computers (in the sense of programming them), not about making computers. Making computers is electrical engineering.
We all know how great we IT people are at naming things ;)
My BS in CS took its roots down to CMOS composition of logic gates and basic EE on the hardware side, and down to deriving numbers and arithmetic from Boolean logic / predicate calculus on the philosophy side. Then it tied those together through the theoretical underpinnings of computation and problem solving, like a trunk, and branched back out into the various mainstream technologies that derived from all that. It obviously all depends on the program at the school of choice, I suppose, and I’m sure it’s evolved over the years. But it still seems important to have at least some courses that pull back the wizard’s curtain, to ensure students really see how it’s all just an increasingly elaborate, high-tech version of conceptually simple (in function) machinery carrying out fundamental building blocks of logic.
Anyway, I’m going to go sniff my own cinnamon roll scented farts while gazing in the mirror, now.
We did the same thing, going so far as to “build” a simple imaginary CPU. It was interesting but ultimately dead knowledge.
I built an emulator for that CPU, which the university course took over and used for a few years for the course. But after that I never did anything with logic gates or anything like that.
I got into DIY electronics later on as a hobby, but even then I never used logic gates and instead just slapped a cheap microcontroller on to handle all my logic needs.
I do use transistors sometimes e.g. for amplification, but we didn’t learn anything about that in university.
In the end it feels like learning how to theoretically mine sand when studying to become an architect. Interesting, but also ultimately pointless.
Computational theory would be a better name, but it overlaps with a more specific subset of what is normally called CS.
We could also just call it Software Engineering. That’s at least the job everyone gets with a Computer Science degree.
It wasn’t named by IT people, though. It was named by academics. And it’s not about using computers, it’s about computing. Computer science is older than digital electronics.
Mhm, and those academics weren’t IT people and had nothing to do with computers?
Let’s fact-check that.
Computer science as an academic course was first created by IBM at Columbia University in 1946, because IBM had made their first commercial computer two years prior and wanted to have people who could operate it and who could continue to develop it.
If you want someone to know about the physical properties of transistors, find an electrical engineer.
Ok, but he didn’t know what a transistor was. Like, I get not knowing the mechanics or chemistry of it, but to literally not know what it is or how it applies to a computer boggles my mind.
Is that not the difference between a computer science and a computer engineering degree?
I was partnered with that guy for one class in grad school. We were working on a master’s degree in software engineering, and the assignment was analysis and changes to an actual code base, and this mofo was asking questions and/or blanking on things like what you mention. I can’t remember the specifics but it was some basic building block kind of stuff. Like what’s an array, or what’s a function, or how do we send another number into this function. I think the neurons storing that info got pruned to save me the frustrating memories.
I just remember my internal emotional reaction. It was sort of “are you fucking kidding me,” but not in the sense that somebody blew off the assignment, was rude, or was wrong about some basic fact. I have ADHD, and years ago I went through some pretty bad periods with that and my overall mental & physical health. I know the panic of being asked to turn in an assignment you never knew existed, or being asked about some project at work and having no idea whatsoever how to respond.
This was none of those. This was “holy shit, this guy has never done anything, how did he even end up here?”
deleted by creator
If a C- is enough to pass Analysis of Algorithms, then a Computer Science degree can make me a Computer Scientist. :P
You need C++ for computer science, though!
looks weird without the cleavage
I literally have no idea what this picture means, and at this point I’m too afraid to ask.
The typical holder of a four-year degree from a decent university, whether it’s in “computer science”, “datalogy”, “data science”, or “informatics”, learns about 3-5 programming languages at an introductory level and knows about programs, algorithms, data structures, and software engineering. Degrees usually require a bit of discrete maths too: sets, graphs, groups, and basic number theory. They do not necessarily cover computability theory (models & limits of computation), information theory (thresholds, tolerances, entropy, compression, machine learning), or the foundations for graphics, parsing, cryptography, and other essentials for the modern desktop.
For a taste of the difference, consider English WP’s take on computability vs my recent rewrite of the esoteric-languages page, computable. Or compare WP’s page on Conway’s law to the nLab page which I wrote on Conway’s law; it’s kind of jaw-dropping that WP has the wrong quote for the law itself and gets the consequences wrong.
I meant the guy in the picture, but thanks anyway
I have been coding since I was 10 years old. I have a CS degree and have been in professional IT for like 30 years. Started as a developer but I’m primarily hardware and architecture now. I have never ever said I was a computer scientist. That just sounds weird.
Yeah, you’d really only say it on the theoretical side of things. I’ve definitely heard it in research and academia, but even then people usually point to the particulars of their work first.
IT stooge != science. Sorry, fellas.
“Engineer of Information”, please 😎
Surely you must be a master of linear algebra and Euclidean geometry
I mean, I am applying various kinds of science, but I’m not actually doing any science, so I don’t think of myself as a scientist. What I do is solve problems - I’m an engineer.
My ex-boss describes himself as such. King of the dickheads.
You are right, man
Good they escaped early