

I’m glad this is still alive. It’s a good idea. I keep thinking about doing the same for TCL.
I do wonder if Bash is perhaps just a bit too insane to make this really feasible though.
Reversed range support
This is a misfeature.


Why? Microsoft gives them a ton of money for CI and infrastructure. Unless they are having serious technical issues, I don’t see why they would move to a more expensive and probably less well integrated CI provider.
The audacity to be nostalgic for Eclipse!


I don’t think you can blame anyone except him for not understanding that AI sometimes hallucinates. Hell, basically every AI tool makes you read that before you start using it. I’m normally very reluctant to blame users for using things incorrectly, but if it took him 3 hours to realise it was wrong here, then I have to say that’s on him.
Come on, has anyone else ever persevered with a hallucinated answer for 3 hours before realising it was a hallucination? The longest it’s taken me is like 5 minutes, and that’s only for things that aren’t easily googleable.


I suspect in the real world it’s frustrating enough for restaurants that it wouldn’t have worked out.
You’re pretty much tricking restaurant workers into one of those awful voice-based phone trees.
Plus there are so many things that can actually happen when you try to book a table on the phone - they don’t have exactly what you want but can offer you this time instead… they only have outside seating available… etc. etc.
Plus, just having a proper online booking form is clearly a better option and not totally uncommon these days.


This is silly, and not in an interesting way.


Actually chars() is pretty simple - it’s just UTF-8 decoding, which is elegant and straightforward.
The complexity is all in Unicode itself, not UTF-8.
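To illustrate the distinction (assuming this is about Rust’s str::chars(); a minimal sketch of my own, not anything from the article):

    fn main() {
        // chars() just decodes the UTF-8 bytes into Unicode scalar values.
        // The decoding step itself is mechanical and simple.
        let s = "héllo";
        for c in s.chars() {
            println!("{c}: U+{:04X}", c as u32);
        }

        // The hard parts are Unicode semantics, not UTF-8. A single
        // user-perceived character can be several scalar values:
        let flag = "🇦🇺"; // two regional indicator symbols
        println!("{}", flag.chars().count()); // prints 2
        // Counting grapheme clusters needs Unicode segmentation rules,
        // which live outside the UTF-8 layer (e.g. the
        // unicode-segmentation crate).
    }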


Interesting. But can’t you do basically the same thing with @nonnull annotations? I remember using something like that a decade ago when I last wrote Java.


IMO automated changelogs like these are not especially useful. Better than no changelog I guess, but nowhere near as good as a proper changelog. But proper changelogs take actual effort.


One example is creating an interface for every goddamn class I make because of “loose coupling” when in reality none of these classes are ever going to have an alternative implementation.
Sounds like you’ve learned the answer!
Virtually all programming principles like that should never be applied blindly in all situations. You basically need to develop taste through experience… and caring about code quality (lots of people have experience but don’t give a shit what they’re excreting).
Stuff like DRY and SOLID are guidelines, not rules.
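For what it’s worth, the pattern described above looks roughly like this in Rust terms (all names are made up for illustration):

    // A trait that will only ever have one implementation, created purely
    // in the name of "loose coupling".
    trait UserRepository {
        fn find_name(&self, id: u64) -> Option<String>;
    }

    struct DbUserRepository;

    impl UserRepository for DbUserRepository {
        fn find_name(&self, _id: u64) -> Option<String> {
            Some("alice".to_string()) // stand-in for a real lookup
        }
    }

    // Every caller now goes through the indirection...
    fn greet(repo: &dyn UserRepository, id: u64) -> String {
        match repo.find_name(id) {
            Some(name) => format!("hello, {name}"),
            None => "hello, stranger".to_string(),
        }
    }

    fn main() {
        // ...even though only one concrete type ever exists. If a second
        // implementation never appears, greet() could just take
        // &DbUserRepository directly, and the trait could be extracted
        // later if the need ever materialises.
        println!("{}", greet(&DbUserRepository, 1));
    }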


AI AI blah blah AI.
Also why is HCL supposedly the 9th most popular “programming language” (which it isn’t anyway)?


It’s a fairly inevitable reaction to cancel culture. This was predicted and warned against when left-wing cancel culture was at its height, but people didn’t listen. Now we have right-wing cancel culture instead.


I wouldn’t recommend the Gang of Four book. Many of the design patterns they espouse are way overcomplicated, from the days of peak OOP. You know, FactoryFactoryVisitor stuff. Usually best avoided.


Yeah, I use Claude/ChatGPT sometimes for:
I haven’t got around to setting up any of that agentic stuff yet. Based on my experience of the chat stuff I’m a bit skeptical it will be good enough to be useful on anything of the complexity I work on. Fine for CRUD apps, but it’s not going to understand niche compiler internals or do stuff with WASM runtimes that nobody has ever done before.
I think you misread my comment. I didn’t say it was a non-commercial project.
I still don’t think it’s fair to say it’s not open source. It clearly is. What you mean is it’s not a non-commercial project.
I think it’s a perfectly reasonable license. You can also use it for free with closed source projects, except embedded projects (where most of the money is), which I think is generous.
I don’t think everything has to be completely free. I’d much rather they had a viable business model and actually continue existing than just fizzle out because they have no funding source. Writing a high quality GUI toolkit is an enormous task so it’s not really going to happen otherwise.
As much as I’m following egui, Xilem, Dioxus, Makepad etc. and hope they succeed, I’d put my money on Slint being the first to make a Rust GUI toolkit of the same quality as Qt.


This video confuses at least three different concepts - quantum uncertainty, ternary computers, and “unknown” values.
Ternary computers are just not as good as binary computers. The way silicon works, it’s always going to be much, much slower.
“Unknown” values can be useful - they are common in SystemVerilog for example. But you rarely just have true, false and unknown, so it makes zero sense to bake that into the hardware. Verilog has 4 values - true, false, unknown and disconnected. VHDL has something like 9!
And even then the “unknown” isn’t as great as you might think. It’s basically a poor man’s symbolic execution and is unable to cope with things like let foo = some_unknown_value ? true : true (see the sketch below). Yes that does happen and you won’t like the “solution”.
High level programming concepts like option will always map more cleanly onto binary numbers.
Overall, very confused video that is trying to make it sound like there’s some secret forgotten architecture or alternative history when there definitely isn’t.
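To make the ? true : true point concrete, here’s a toy three-valued evaluator - my own sketch of the naive scheme being criticised, not actual SystemVerilog X-propagation semantics:

    // Toy three-valued logic where "unknown" simply propagates.
    #[derive(Debug, Clone, Copy, PartialEq)]
    enum Tri {
        False,
        True,
        Unknown,
    }

    impl Tri {
        // Naive select: if the condition is unknown, the result is unknown,
        // even when both branches happen to agree.
        fn select(self, if_true: Tri, if_false: Tri) -> Tri {
            match self {
                Tri::True => if_true,
                Tri::False => if_false,
                Tri::Unknown => Tri::Unknown,
            }
        }
    }

    fn main() {
        let some_unknown_value = Tri::Unknown;
        // The case from above: some_unknown_value ? true : true.
        // Semantically this is always true, but the naive evaluator loses that.
        let foo = some_unknown_value.select(Tri::True, Tri::True);
        println!("{foo:?}"); // Unknown
        // Getting "True" back out needs a merge rule over both branches (or
        // real symbolic reasoning), which is presumably the kind of
        // "solution" being alluded to above.
    }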


Yeah, but these regulatory burdens can only be borne by mega-corporations, so even though it is extra work for them, it still benefits them.
Is Deno not convenient and fast? I am also interested in knowing why I would want to use Bun over Deno.