
The Windows 95-XP taskbar is good. Everything else has been downhill.

I use Trinity Desktop on Linux because it's basically the same as the Windows 95-XP taskbar interface, and has no plans to change.

To be fair, modern KDE has more-or-less the same taskbar.

And the taskbar is also not optimal. Having text next to the icons is great, but it means you can only really have, like, 4 or 5 applications open and see all their titles and stuff. Which is why modern Windows switched to just icons - which is much worse, because now you can't tell which app window is which!

The optimal taskbar, imo, is a vertical one. I basically take the KDE panel and just make it vertical. I can easily have 20+ apps open and read all their titles. Also, I generally think vertical space is more valuable for applications, and you get more of it this way.

It also allows me to ungroup apps, so that each window is its own entry in the taskbar - one less click. And it works because I can read the window title.


> modern KDE has more-or-less the same taskbar.

More or less, yes; Trinity Desktop is basically KDE 3. But KDE has added on a lot of other cruft since then that has no value to me.

> Having text next to the icons is great, but it means you can only really have, like, 4 or 5 applications open and see all their titles and stuff.

That's what multiple virtual desktops are for. My usual desktop configuration has 8. Each one has only a few apps open in it.

> The optimal taskbar, imo, is a vertical one.

I do this for toolbars in applications like LibreOffice; on an HD aspect ratio screen it makes a lot more sense to have all that stuff off to the side, where there's more than enough screen real estate anyway, than to have it taking up precious vertical space at the top.

But for my overall desktop taskbar, I've tried vertical and it doesn't work well for me, because to show titles it would have to be way too wide. The horizontal taskbar does take up some vertical space at the bottom of the screen, but I can make that pretty small by downsizing it to either "Small" or "Tiny".


I think you're exaggerating a little, but aren't entirely wrong. The Internet has completely changed daily life for most of humanity. AI can mean a lot of things, but a lot of it is blown way out of proportion. I find LLMs useful to help me rephrase a sentence or explain some kind of topic, but it pales in comparison to email and web browsers, YouTube, and things like blogs.

My limited understanding (please take with a big grain of salt) is that they 1.) sell mainframes, 2.) sell mainframe compute time, 3.) sell mainframe support contracts, 4.) sell Red Hat and Red Hat support contracts, and 5.) buy out a lot of smaller software and hardware companies in a manner similar to private equity.

This is going on all over again.

Agentic AI really is changing things. I've had a complete change of heart about it. It's good enough now to boost productivity MASSIVELY for devs.

I think this is one of those things that can be situationally useful, but also come with huge risks to the majority of users.

They come up with tons and tons of products like Google Glass and Google+ and so on and immediately abandon them. It is easy to see that there is no real vision. They make money off AdSense and their cloud services. That's about it.

Google does abandon a lot of stuff, but their core technologies usually make their way into other, more profitable things (collaborative editing from Wave into Docs; loads of stuff from Google+; tagging and categorizing in Photos from Picasa (I'm guessing); etc.)

It annoyed me recently that they dropped support for some Nest/Google Home thermostats. Of course, they politely offered to let me buy a replacement for $150.

LLMs are useful tools, but certainly have big limitations.

I think we'll continue to see anything that can be automated get automated in a way that reduces head count. So you have the dumb AI as a first line of defense and lay off half the customer service staff you had before.

In the meantime, fewer and fewer jobs (especially entry level), a rising poor class as the middle class is eliminated and a greater wealth gap than ever before. The markets are going to also collapse from this AI bubble. It's just a matter of when.


I tried to get into Clojure, but a lot of the JVM-hosted languages require some Java experience. Same thing with Scala and Kotlin, or F# on .NET.

The early tooling was also pretty dependent on Vim or Emacs. Maybe it's all easier now with VSCode or something like that.


It doesn't require any Java, but the docs do at times sort of assume you understand the JVM to some extent - which was a bit frustrating when first learning the language. It'll use terms like "classpath" without explaining what that is. However, nowadays with LLMs these are insignificant speedbumps.

If you want to use Java you also don't really need to know Java beyond "you create instances of classes and call methods on them". I really don't want to learn a dinosaur like Java, but having access to the universe of Java libs has saved me many times. It's super fun and nice to poke around mature Java libs interactively from a REPL :)
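
For anyone reading along who's curious what "create instances and call methods" actually looks like from Clojure, here's a tiny sketch (the java.time/java.io classes are just arbitrary standard-library examples, nothing special about them):

    ;; The whole interop story in three forms:
    (import 'java.time.LocalDate)

    (LocalDate/now)            ; static method: (ClassName/method args)
    (def today (LocalDate/now))
    (.plusDays today 7)        ; instance method: (.method instance args)
    (java.io.File. "/tmp")     ; constructor: (ClassName. args)

That really is about all the Java you need to start leaning on Java libraries from the REPL.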

All that said, I'd have no idea how to write even a hello world in Java.

PS: Agreed on Emacs. I love Emacs.. but it's for turbo nerds. Having to learn Emacs and Clojure in parallel was a crazy barrier. (And no, Emacs is not as easy as people make it out to be.)


None of this is even remotely true. I've gotten into Clojure without knowing jackshit about Java, and almost ten years later, after tons of things successfully built and deployed, I still don't know jackshit about Java. Mia, co-host of the 'Clojure apropos' podcast, was my colleague; we've worked together on multiple teams, and she learned Clojure as her very first PL. Later she tried learning some Java and was shocked at how impossibly weird it looked compared to Clojure. Besides, you can use Clojure without any JVM - e.g., with nbb. I use it for things like browser automation with Playwright.
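
For the curious, a rough sketch of what the nbb + Playwright combo looks like (this assumes you've run `npm install playwright`; the exact require incantation may vary a bit by nbb version):

    ;; run with: npx nbb scrape.cljs
    (ns scrape
      (:require ["playwright" :as pw]      ; Playwright's Node package
                [promesa.core :as p]))     ; promise helpers bundled with nbb

    ;; p/let awaits each JS promise in sequence
    (p/let [browser (.launch pw/chromium)
            page    (.newPage browser)
            _       (.goto page "https://example.com")
            title   (.title page)]
      (println "page title:" title)
      (.close browser))

No JVM anywhere in sight - it's all Node underneath.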

The tooling story is also very solid - I use Emacs, but many of my friends and colleagues use IntelliJ, Vim, Sublime and VSCode, and some of them migrated over from Atom.


It might not be a problem for you, but it has been for many. I did start by reading through three Clojure books. The REPL and the basic stuff like using lists is all easy of course, but the tooling was pretty poor compared to what I was used to (I like Lisp, but Emacs is a commitment). Also, a lot of tutorials at the time definitely assumed Java familiarity, especially when it came to debugging Java stack traces.


> It might not be a problem for you, but it has been for many

Do you have a habit of referring to yourself in plural, or do you typically like to generalize things based on your personal experiences?

I personally know many Clojurists who never had the problems you're describing - hundreds of people. Sure, that could be a case of survivorship bias; perhaps I just don't befriend people who struggled with getting into Clojure in the way you're describing. But like they say: "Those who are willing to make the effort will find the solutions. Those who aren't will find the excuses."

Clojure undeniably had challenges in the past, and still has some today. But not the things you're talking about. This is literally not an exaggeration - it's as easy as installing the Calva extension for VSCode. That's all one needs to mess around with Clojure.


I've had this discussion here on HN several times over the years. Lots of comments from others have pointed out similar experiences. I'm guessing your experience was more positive and that's great to hear.

I did point out that maybe things had changed a good bit (literally said maybe VSCode made that easier now as it has for other tools) and tried to make it clear that my experience was a bit dated.

As far as excuses go, I don't see how that's relevant. I just pointed out that I had issues with a steep learning curve when I was seriously considering it many years ago, along with other languages that are hosted on the JVM (Scala, Kotlin) or .NET (F#). Nothing against those languages, but all the tutorials and even many of the books at the time would frequently borrow from the host language in weird ways. Like, I'd have to use some random Java library, and when it didn't work, I had no idea how to troubleshoot why it wasn't there, and I didn't want to have to go learn Java first.

I own at least two books on F# and talked with some prominent authors personally, and they admitted it was really geared toward intermediate-or-better C# users who wanted to move over to functional programming. I could have stuck with it, but decided to stick with other tools.

Clojure certainly is nice and I wanted to take advantage of it...it just ended up not being as ergonomic for my needs as I had hoped.


Perfect solution for doing analysis on tables. Wes McKinney (the inventor of pandas) is rumored to have been inspired by it too.

My problem with APL is 1.) the syntax is less amazing at other, more mundane stuff, and 2.) the only production-worthy versions are all commercial. I'm not creating something that requires me to pay for a development license as well as distribution royalties.


The Windows 95 left tile was basically perfect. A lot of Linux distros have something similar. It allows you to quickly survey the useful programs. There is no further perfection.

A close second in my book was the PlayStation 3 user interface. Gloriously intuitive. The PlayStation 4 and the new Xbox are god awful. I can't wait to buy a Steam Machine and never have to search for my freaking game again like on the Xbox monstrosity that has all kinds of crapware on it. Is frustrating your users good for business?

Windows 8 and the Ubuntu of around that time both had absolutely bonkers interfaces. Is it better for a phone? Sure... but I'm not using a phone. Windows 8 was so bad I honestly can't believe it wasn't blocked by upper management. It made all the previous customer/user knowledge worthless. I literally had to memorize all these Windows key + letter commands just to shut down the computer and find My Documents.


You're correct in a sense, but glossing over a very real problem. You still need some kind of knowledge of how to use a desktop computer for a lot of jobs in the workforce. If all you know is how to click on some apps, then you're at a competitive disadvantage. There are plenty of horror stories of Gen Z being just as bad as Boomers at tech (like not being able to figure out how to copy files), and that should concern everyone on HN. I'm sure some grow up with Raspberry Pi computers, but 99% are probably iPad-only kids.

