- 2 Posts
- 80 Comments
Social justice thief is basically Robin Hood.
Social justice assassin is Luigi.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 2 · 19 hours ago
Python has come a long way in recent years. I remember when Android switched to an ahead-of-time compiler for its Java.
Yeah, stuff improves a lot, but prejudices often stay the same. Java is really fast nowadays. That didn’t use to be the case.
My current project is trying to create a cool fork of Mobian for the PinePhone with overclocks and some other stuff. Right now I'm editing the device trees to get about 50% more performance out of the PinePhone for roughly the same battery life.
With some other things, a bigger battery, and a custom modem firmware that can downclock the CPU in it, I’m getting 2% battery drain per hour with the screen off.
Yeah, with a lot of work, really cool things can be done.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 1 · 1 day ago
Technically speaking, interpreted languages aren't compiled at all. That was the original definition.
Nowadays, there's hardly any classically interpreted language left. All major interpreted languages compile to bytecode, which is then run by a kind of VM that interprets that bytecode. And many languages (like Java) go even further and compile that bytecode into native machine code at runtime.
Being interpreted, though, is an implementation feature, not a language feature. So, for example, if you use CPython, Python is compiled into bytecode when you first run a script. The bytecode is then stored and used the next time you run the same script as long as it hasn’t been changed in the meantime. You can also force the compilation to bytecode and only ship the bytecode.
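That compile-to-bytecode step can be forced by hand with the stdlib's `py_compile` module; a minimal sketch (the throwaway file and its contents are just for illustration):

```python
# Sketch: triggering CPython's compile-to-bytecode step manually.
# The temp file and its contents are illustrative, not from the comment above.
import os
import py_compile
import tempfile

src = os.path.join(tempfile.mkdtemp(), "hello.py")
with open(src, "w") as f:
    f.write("print('hello')\n")

# Compiles hello.py and writes the cached bytecode into __pycache__
# next to the source; returns the path of the .pyc it produced.
pyc = py_compile.compile(src)
print(pyc, os.path.exists(pyc))
```

Shipping only those `.pyc` files is the "force the compilation to bytecode and only ship the bytecode" workflow described above.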
But if you use PyPy instead of CPython, you get a tracing just-in-time compiler that translates the hot parts of your Python into native machine code at runtime. This increases the performance of pure Python by around 5x, according to PyPy's own benchmarks.
But that kind of benchmarking is kinda flawed anyway because most real-life programs don’t only use pure Python.
There's a thing called Cython (not to be confused with the Python interpreter called CPython), which compiles Python-like code to C and makes it easy to call C code from Python. Cython is used by almost all modules that contain performance-critical code, and the result is close to the speed of hand-written C.
In most applications, you have a concept called "hot code": specific code paths that take up the vast majority of the execution time (usually 95+% of the time is spent on just a few code paths). So when optimizing Python code, you figure out which paths those are and then use Cython to implement them in C (or use a 3rd-party module that already does that).
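That "figure out which paths are hot" step is usually done with a profiler before reaching for Cython; a sketch using the stdlib's `cProfile` (the function names are hypothetical stand-ins for real application code):

```python
# Sketch: locating hot code with the stdlib profiler before optimizing.
# hot() and cold() are hypothetical stand-ins for real application code.
import cProfile
import io
import pstats

def hot(n):
    # Dominates runtime: this is the path worth moving to Cython/C.
    return sum(i * i for i in range(n))

def cold():
    # Glue code: barely shows up in the profile, fine to keep in Python.
    return "done"

profiler = cProfile.Profile()
profiler.enable()
hot(500_000)
cold()
profiler.disable()

# Print the five most expensive entries by own time; the hot path
# should dominate the listing.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("tottime").print_stats(5)
print(out.getvalue())
```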
Then you use Python only as a “glue code” that covers all the low-usage code.
In that use case, Python is only marginally slower than C.
Slow Python programs are usually an issue of optimization, not an issue of the language itself.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The demise of Flash didn't bring any big HTML5/JS equivalent for watching animations; fast internet and better video compression made those types of animations become raster videos as well • 11 · 1 day ago
When you have no valid argument, become offensive.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 2 · 1 day ago
This is a very apt description.
Lua is like using duct tape for everything. It worked to save the Apollo 13 mission, so it should work anywhere.
Visual Basic is like using the tools you found in a cardboard box while clearing out your late uncle's house. You'll get random, super-specialized things, but if one of these old tools breaks, you just have to carry on with the remains.
Cobol is like using the toolshop at the Renaissance fair, only that the tools are original, not reproductions.
Brainfuck is like your kid's toy tool box. Yes, there's something that kinda looks like a plastic hammer, but good luck using it for anything other than role-playing.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 1 · 1 day ago
You forgot
CommunistRussian.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 2 · 1 day ago
This is it. OOP is a first-year CS student who thinks he's edgy.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 1 · 1 day ago
I'm talking mostly about corporate software development. You know, the kind of projects that run for 20 years with multiple teams and half a dozen cycles of "let's outsource everything" - "oh, it sucks, let's insource everything" - "oh, it's expensive, let's outsource everything again". Doing that in C without major issues is rare.
In that kind of context, safety is everything and performance doesn’t matter.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 1 · 1 day ago
You know why functional programming languages don't have side-effects?
Because nobody uses them.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 1 · 1 day ago
What you're saying makes sense.
When it comes to programming languages, I like to think of them as tools for a job. All languages have advantages and downsides.
- For server software: Java, by far (especially if it's supposed to scale).
- For web frontends: TypeScript.
- For very simple scripts that mostly call other tools: bash.
- For more complex scripts, non-performance-critical data processing and small projects: Python.
- For microcontroller work: C.
- For more performant microcontrollers: C + Lua.
- For tests: Groovy is surprisingly helpful.
- For game development: GDScript, or whatever your chosen environment supports.
The rest is just syntax. It doesn’t really matter whether I use curly braces or indentation.
I do like the old if-endif block style, but sadly that doesn't really exist in mainstream languages anymore. Lua is the only thing that's kinda similar, but it uses a bare "end" for everything, which negates the advantage of being able to see at a glance where the "for" ends in a sea of "ifs".
I guess bash does something similar too, but “fi” and “esac” really break my fingers (and then they don’t even do “elihw”).
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 3 · 2 days ago
It's pure money and capitalism and nothing else.
Companies don’t pay for computation time and memory on customer devices.
Companies barely pay for computation time and memory on their devices.
So why should they care?
The only thing that could limit that would be if e.g. electricity was taxed really high, but then again, electricity isn’t even what makes computation expensive. Hardware is much more expensive.
I’m glad Linux exists and it’s still written in C.
While that's true of the kernel, it's not true of other components. GNOME Shell and Cinnamon, for example, are mostly JavaScript. Almost half of the KDE Plasma code is QML.
I want to be part of the solution and not the problem, but I understand survival and keeping a job is important to someone like you.
I’m sure, in your day job there’s also things that don’t work, things where corners are cut and things where you could do much better if you had infinite time, energy and budget. There’s not a lot of people who leave work every day knowing that they performed absolute perfection every single day.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 1 · 2 days ago
Perfect C is faster than perfect Python, same as perfect assembly is faster than perfect C.
But in the real world we don't write perfect code. We have deadlines, we have constantly shifting priorities, we have non-technical managers dictating technical implementations. We have temporary prototype code that ended up being the backbone of a 20-year project because management overpromised and some overworked developer had to deliver real fast. We have managers still believing in the Mythical Man-Month ("If one woman can make a baby in 9 months, 9 women only need a single month to make one") and we have constant cycles of outsourcing and then insourcing again.
With all that garbage going on we end up with a lot of “good enough for now”, completely independent of “laziness” or “low-skill” of developers. In fact, burnout is incredibly common among software developers, because they aren’t lazy and they want to write good software, but they get caught in the gears of the grind until they get to a mental breakdown.
And since nobody has the time to write perfect code, we end up with flawed and suboptimal code. And suboptimal assembly is much worse than suboptimal C, which is again much worse than suboptimal Python.
If your Python code is suboptimal, it might consume 10x as much RAM as it needs. If your C code is suboptimal, it memory-leaks and uses up all the RAM your PC has.
If your Python code is buggy, something in the app won’t work, or worst case the app crashes. If your C code is buggy, some hacker just took over your PC because they exploited a buffer overflow to execute any code they want.
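The difference shows in how CPython handles an out-of-range write: every sequence access is bounds-checked, so the bug surfaces as an exception instead of silent memory corruption. A minimal sketch:

```python
# Sketch: CPython bounds-checks list access, so the buggy write below
# raises IndexError instead of overwriting adjacent memory as C might.
buf = [0] * 8

try:
    buf[16] = 0xFF  # bug: index past the end of the buffer
except IndexError as exc:
    print("caught:", exc)
```

In C, the equivalent out-of-bounds write compiles and runs, which is exactly the class of bug behind buffer-overflow exploits.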
The main issues with software performance are:
- Management doesn't plan right and developers need to make do with what they have
- Companies don’t want to spend incredible amounts of money on development to make things perfect
- Companies want products to be released in time
- Customers aren’t happy with simple things. They want pretty, fancy things
- Customers don’t want to pay for software. In today’s money, Win95 cost around €500 and Office cost around €1000. Would you want to spend that? Or do you expect everything to be free? How much did you pay for all the software on your phone?
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 3 · 2 days ago
Not exactly. In C you have to do everything by hand. There's a ton of weird, badly fitting parts and stuff that doesn't really make sense.
With Python, stuff just works. I worked as a Python developer for almost 10 years (I've since switched to Java and Kotlin). There were hardly any real WTF moments in all that time.
I use C/C++ for my hobby stuff (I do a lot of hobby microcontroller development) and there are tons of weird gotchas I would never have imagined (e.g. a missing return statement in a non-void function is, by default, at most a warning rather than the error it really should be, and the program then semi-crashes at runtime).
Python is slower, but as long as you have a project where performance doesn’t matter, it’s day and night. It’s like working with something that was purposely designed by someone who has somewhat of an idea what they are doing, compared to C/C++ which feels like something that just happened.
In my hobby stuff I now added a Lua interpreter for a kind-of app system, and while Lua is an incredibly bare-bones language it still runs laps around C/C++ when it comes to usability.
Maybe to make the metaphor of the dude before me more poignant: C feels like your granddad’s kit car that you inherited. C++ feels like you got the same kit car after the neighbourhood crackhead had it for a few years and bolted all sorts of weird accessories onto it and did a lot of “tuning”.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 2 · 2 days ago
Python is 34 years old already. That means someone who was already working as a programmer when Python came out would have to be about 54 years old or older now.
I wonder why people still think it’s the hot new thing.
When Python came out, C was 19 years old. So Python is almost twice as old now as C was when Python came out.
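The arithmetic above is easy to check. The release years are public history; the "now" year is an assumption about when the comment was written:

```python
# Checking the age arithmetic above. Release years are historical;
# CURRENT_YEAR = 2025 is an assumption about when the comment was written.
PYTHON_RELEASE = 1991  # Python 0.9.0
C_RELEASE = 1972       # first C at Bell Labs
CURRENT_YEAR = 2025

python_age_now = CURRENT_YEAR - PYTHON_RELEASE           # 34
c_age_when_python_came_out = PYTHON_RELEASE - C_RELEASE  # 19
print(python_age_now, c_age_when_python_came_out)
```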
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 2 · 2 days ago
I fear not so. To them, everything outside of the USA is the same thing.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 2 · 2 days ago
The first language I learned had an if-endif-style syntax. Curly braces SUCK compared to that.
The only downside whitespace has is that it often gets lost when copy-pasting.
It's virtually impossible to share a small Python snippet over some messengers. Other than that, I prefer whitespace over curly braces.
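What makes the copy-paste problem specific to Python is that stripped indentation isn't just ugly, it's a syntax error. A minimal sketch:

```python
# Sketch: the same two-line snippet with and without its leading spaces.
# Messengers often strip the spaces, and the result no longer compiles.
good = "if True:\n    x = 1\n"
stripped = "if True:\nx = 1\n"  # what often survives a copy-paste

compile(good, "<snippet>", "exec")  # compiles fine

try:
    compile(stripped, "<snippet>", "exec")
except IndentationError as exc:
    print("broken paste:", exc.msg)
```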
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The C programming language is like debating a philosopher and Python is like debating someone who ate an edible • 2 · 2 days ago
There's one big difference between hobby work and professional work: if you do hobby stuff, you can spend as much time on it as you want and you are doing what you want. So likely you will do the best you can at your skill level, and you are done when you are done, or when you give up and stop caring.
If you do professional work, there's a budget and a deadline. There are a dozen things you need to do RIGHT NOW that needed to be finished yesterday. There's not one person working on things but dozens, and your code doesn't live for weeks or months but for years or decades, and it will be worked on by someone when you are long gone. It's not rare to stumble upon 10- or 20-year-old code in most bigger applications. There are parts of the Linux kernel that are 30 years old.
Also in professional work, you have non-technical managers dictating what to do and often even the technical implementation. You have rapidly shifting design goals, stuff needs to be implemented in crunch time, but then gets cancelled a day before release. Systems are way more complex than they need to be.
I am currently working on the backend of the website and app for a large retail business. The project is simple, really: get content from the content managers, display a website with a webshop, handle user logins and loyalty program data. Not a ton of stuff.
Until you realize:
- The loyalty program is handled by what used to be a separate company but got folded into our company.
- The webshop used to be run by the same team, but the team got spun out into its own organisation in the company.
- The user data comes from a separate system, managed by a team in a completely different organization unit in the company.
- That team doesn’t actually manage the user data, but only aggregates the user data and provides it in a somewhat standardized form for the backends of user-facing services. The data itself lives in an entirely separate environment managed by a different sub-company in a different country.
- They actually don’t own the data either. They are just an interface that was made to provide customer data to the physical stores. They get their customer data from another service, managed by another sub-company, that was originally made to just handle physical newsletter subscriptions, 20 years ago.
We are trying to overhaul this right now, and we just had a meeting last week, where we got someone from all of these teams around a table to figure out how different calls to the customer database actually work. It took us 6 hours and 15 people just to reverse-engineer the flow of two separate REST calls.
If you see bugs and issues in a software, that’s hardly ever due to bad programmers, but due to bad organizations and bad management.
I don't mean you can literally understand everything about a computer, just that you can understand everything you need to in order to do 99% of things, and this isn't some crazy thing. You would obviously use OpenGL, Vulkan or DirectX to access the GPU instead of writing GPU binaries by hand.
This is exactly what the software crisis is, btw. With infinite time and infinite brain capacity, one could program optimally. But we don’t have infinite time, we don’t have infinite budget, and while processors get faster each year, developers just disappointingly stay human.
So we abstract stuff away. Java is slower than C (though not by a ton), mostly because it has managed memory. Managed memory means no memory management issues. That’s a whole load of potential bugs, vulnerabilities and issues just removed by changing the language. Multitasking is also much, much easier in Java than in C.
Now you choose a framework like Spring Boot. Yes, it’s big, yes you won’t use most of it, but it also means you don’t need to reimplement REST and request handling. Another chunk of work and potential bugs just removed by installing a dependency. And so on.
Put it differently: how much does a, let's say, 20% slowdown due to managed memory cost a corporation?
How much does a critical security vulnerability due to a buffer overflow cost a corporation?
Hobby development and professional development aren’t that similar when it comes to all that stuff.
squaresinger@lemmy.world to Showerthoughts@lemmy.world • The demise of Flash didn't bring any big HTML5/JS equivalent for watching animations; fast internet and better video compression made those types of animations become raster videos as well • 11 · 2 days ago
Apparently you were neither a great Flash dev nor a great software dev, but just someone who talks a lot.
Being offensive doesn’t make your point any more correct.
This is another thing where hobby and professional development diverge.
For professional development, freedom just means unmaintainable code. For projects that run for a longer time you want everything to be as standardized as possible. People 10 years from now will need to understand your code, even if they live in a different country, went to a different university and haven’t seen any code from your organization before they join the project.
Cool and clever tricks will likely cause more trouble in the future than they will ever be worth right now. You write code once, but you will keep reading and re-working it over and over again.
This is exactly the issue here. When you stumble upon code that uses an obscure feature like that, it’s a “wtf moment” and it will likely result in something being used wrong and something causing a bug. We don’t want that.
That’s why close to every professional project uses a linter, which blocks you from using problematic patterns and illegible code. If you use C with a linter, it will force you to format your code in a certain way as well.
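For Python, that standardization is typically a few lines of linter config checked into the repo. A hypothetical flake8 fragment for `setup.cfg` (the specific limits and paths are illustrative, not from any real project):

```ini
; Hypothetical flake8 configuration in setup.cfg; the values are examples.
[flake8]
max-line-length = 100
extend-ignore = E203
exclude = .git,__pycache__,build
```

With this in place, CI can fail any merge request that violates the agreed style, which is what keeps a decade-old codebase readable across teams.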
If you just DIY your own small projects and discard them before they become old, code style doesn’t matter. But if you ever looked at the code of one of your old projects and it took you a while to understand what you did there, then that’s the result of bad code style.