Yeah, weird-shaped windows are definitely not something that should make a comeback. Just because you could doesn't mean you should.
> Today, all Windows desktop apps look the same as they are the same; they are all built on crap React, Electron, electronbun, and Tauri browser wrappers that mimic the real Desktop apps.
Desktop apps should look the same because they should use the OS GUI framework; that has nothing to do with React and Electron. I can't quite understand this argument; being webview based enables applications to look different from each other, like websites do, not similar. If they still do look similar, that's a good thing.
> The point was usually not usability. It was identity.
Yeah. And usability is sort of a big deal. Applications that implement their own widgets or color schemes or nonstandard shapes usually pay zero attention to usability or accessibility. They almost invariably lack all standard affordances and disregard the standard UX guidelines.
Also, ironically the applications with the most "identity" today tend to be control panels and other accessories by HW manufacturers bundled with device drivers, and they also happen to be the crappiest, most terrible bloatware that an average user is likely to encounter.
Not only that, but I think that Electron leads to the opposite problem: all apps look and behave differently, they don't follow platform guidelines, they look out of place.
I never had a problem with that. I want a specific application to behave the same no matter where I run it. I do not want my muscle memory for how to use an application to be confused by an application not looking or behaving the way I am used to when moving to a different platform.
Of course all the applications bundled with a specific OS should be designed to work the same and work well together. It still makes sense to have guidelines and standard widgets in a system. But I prefer very much any third-party multi-platform app to be identical everywhere I run it.
Not to defend Electron. There are many native frameworks that work the way I prefer, looking the same across platforms.
I use probably 70% Windows, 20% iPad, 5% Meta Quest 3 [1], and 5% MacOS -- for the latter though it is mostly "test that something works on MacOS" and "tech support for the computer the family uses".
I like web-based applications that behave the same everywhere. Personally I feel the MacOS widget set is a touch old fashioned, a little ugly and gauche. I can see though why somebody might like the MacOS terminal better than CMD.EXE. The dominant theme on Windows is that Windows has several widget sets that aren't consistent but the average user doesn't notice or care -- probably the worst area is the settings dialogs which seem to be mostly migrated to a Metro-based design lately. I was afraid before they wouldn't finish that migration before they churned to another framework but I think they've stopped the churn.
The best windows applications, in my mind, steal from web technology -- like they are either using some kind of HTML-based UI or they are made by people who grew up making web applications and reproduce those patterns w/ the desktop widget sets.
[1] I've got some web applications I wrote that run perfectly on the MQ3, especially after I got target sizes up to WCAG AAA level and it is fun to put the headset on and crash out on the couch and get things done
> I've got some web applications I wrote that run perfectly on the MQ3, especially after I got target sizes up to WCAG AAA level and it is fun to put the headset on and crash out on the couch and get things done
One way to have better text in VR for 2D content is to make use of OpenXR composition layers:
I hate applications that don’t feel native to their OS, and the common solutions for looking the same across platforms (Java and Electron) fail at being good on every platform. Their attempts to replicate native functionality result in Java applications that can’t reliably resize columns, sort tables, or accept focus when a dialog is open, among a dozen other annoying failures I’ve seen, all while looking like they were imported from a failed OS that never succeeded, for good reason. Electron (and Microsoft’s own work alike) results in slow, bloated applications that can’t handle focus either, or flip from screen to screen properly, or even be discoverable, because their controls aren’t anything real native applications use.
I don't think platform guidelines that anyone listens to have been a real thing for a long time. Even between apps released by MS there is little or no consistency at times, things that should be part of standard OS provided chrome like title-bars are a random mess - good luck guessing what has input focus sometimes, particularly with multiple monitors, as you unlock or switch vdesktop, without clicking to make sure.
I keep thinking of writing something that detects the top-most app window and draws an obvious box around it.
Native macOS developers respected Apple's Human Interface Guidelines for a long time, but even that's declining now that everyone needs to work around all the problems with Liquid Glass.
>> I keep thinking of writing something that detects the top-most app window and draws an obvious box around it.
I would use this in a heartbeat. With Windows 10/11 I usually have the option to apply a garish accent color to the active window. Nowadays, more and more apps don't use native window frames anymore, so that option works less and less.
The W11 task bar with its barely legible indicators doesn't help either.
On a big ultra-wide display with a few windows open, I sometimes struggle to see which one is active.
> > I keep thinking of writing something that detects the top-most app window and draws an obvious box around it.
> I would use this in a heartbeat.
I may one day get around to it. Of the many projects on my “will probably never actually happen” list¹ it is one of the smallest. I did something similar to add other decorations to windows back in my just-post-Uni days². Walking the process list, getting the hWnd(s) you were interested in, and from there the window dimensions, was fairly trivial, and it no doubt still is.
----
[1] I mention them here where relevant, in the hopes that someone else will see the ideas and be inspired to implement them in an open form so I don't have to :-)
[2] ~win2000 era, I was playing in Delphi at the time
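The mechanics the parent describes haven't changed much: on Windows you'd grab the active window via GetForegroundWindow and its bounds via GetWindowRect (both real Win32 calls), then paint four thin always-on-top strips around it. The Win32 plumbing is platform-specific, so here is just a minimal sketch of the geometry step; the border_strips helper and its rect tuples are made up for illustration:

```python
# Sketch of the "obvious box around the active window" idea.
# On Windows the rect would come from GetForegroundWindow() +
# GetWindowRect() (via ctypes or pywin32); here we only compute the
# four border strips you'd then paint as topmost layered windows.

def border_strips(rect, thickness=4):
    """rect is (left, top, right, bottom) in screen coordinates.
    Returns four strip rects framing it: top, bottom, left, right."""
    l, t, r, b = rect
    return [
        (l - thickness, t - thickness, r + thickness, t),  # top strip
        (l - thickness, b, r + thickness, b + thickness),  # bottom strip
        (l - thickness, t, l, b),                          # left strip
        (r, t, r + thickness, b),                          # right strip
    ]

if __name__ == "__main__":
    # Hypothetical 400x300 window at (100, 100)
    for strip in border_strips((100, 100, 500, 400)):
        print(strip)
```

Redrawing on a timer (or on a WinEvent hook for EVENT_SYSTEM_FOREGROUND) keeps the box tracking focus changes.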
Do platforms even follow their own guidelines? And if they do, are those guidelines good? Microsoft doesn't seem to care about UI/UX at all, Apple's UI/UX quality gets worse each year, and Linux is all over the place with each distro doing its own thing. What guidelines are those apps supposed to follow?
Looking at the current state of things, I think it's good that apps tend to do whatever they think is best for their use case. Also, most people don't switch between 100 different apps all the time.
> Yeah, weird-shaped windows are definitely not something that should make a comeback. Just because you could doesn't mean you should.
My opinion here is the exact opposite of yours. Make computers cool again! They used to look like an alien spaceship, now everything looks like paperwork.
I'm with you. It's fine if your opinion is "software = tool" and you personally want it to be utilitarian and basic, but plenty of us desire our software interfaces to have personality and customization. The answer is that we should have the option to choose.
I use computers for plenty of fun, but those programs were always the digital equivalent of garden gnomes. Utterly kitschy. Having a basic sense of aesthetics doesn't imply thinking that computers should be dull or just for work.
Accessibility is really important as well, as there are different laws and regulations covering people's rights here too. Modern cross-platform GUI frameworks (as heavy as they can be) have no issues supporting screen readers and HiDPI for people with sight difficulties.
Agreed. I remember seeing quite a few non-standard designs in the days of Vista, especially when Microsoft was heavily promoting the Windows Presentation Foundation framework and using XAML for UI design.
The problem with setups like this is that the moment you need to resize them, place them in a specific spot, or move them to a larger or smaller monitor, they tend to scale terribly and end up causing all kinds of “death by a thousand cuts” issues.
It would help if you had something more specific to say about usability rather than blindly defending the bland conformity: for example, if your music player app looks like your physical old round CD player, not a rectangle, how exactly does that hurt usability?
> Applications that implement their own widgets or color schemes or nonstandard shapes usually pay zero attention to usability or accessibility.
OSes themselves are close to this: pick any era of the constantly changing OS color schemes and widget designs, and you'll find plenty of issues, with the most basic of basics, readability, suffering.
So again, why should everyone be generically bad just because the OS vendor wrote the "guidelines"? Sure, change doesn't mean good, but neither does using the defaults.
Quite. And the era when everyone was trying to "do their own thing" with UI design wasn't exactly pleasant or usable. Just have a look at some of these designs, for example.
Not to mention that the statement is wrong. Windows applications do NOT look the same, and that's bad.
Oh... except for their lack of a title bar, which prevents you from telling which application you're looking at. Is this PDF open in Edge, or Acrobat? Who knows. The windows look the same.
Beyond that... it's a disgraceful mess. You have applications now with no menu bar, but instead a bunch of hamburger buttons and "gear" buttons scattered all over the place. And common, standard functions like "save file" are further hidden behind "more" labels even in THOSE menus.
Another example of Windows's galling regression: the abolition of the File dialog in many apps, which replace it with a giant page of crudely drawn, unlabeled, super-wide text boxes and a bunch of plain text. There's no file structure shown, so you have no idea where you are about to save a file. It's truly a clinic on dogshit UI. Pathetic.
Companies like IBM and Microsoft did a lot of HCI research back in the 80s, and made a lot of progress with usability and common idioms that all software followed. Then when displays with 256 or more colors became common, all that went out the window.
All those Windows Media Player skins were awful because they used so much screen real estate on dead space. Whereas the plethora of Winamp skins kept the economy of screen real estate while still providing unique and imaginative visuals.
The whole skeuomorphic trend starting in the mid-90s was similarly awful for the same reason. First, it was often hard to tell what was a control and what was just decoration. Second, it often took trial and error to figure out what was what. And, as I mentioned above, these designs almost inevitably wasted huge chunks of screen space on decoration that provided no functionality.
Of course, we have the opposite problem now. All windows look the same. Title bars are mostly gone. And since companies like Microsoft replaced all their HCI experts with art-school dropouts who think the "flat" look with low contrast is cool, not only can you not tell what app you're looking at. Half the time you can't even tell where one window stops and another starts.
The only good UI thing that's come out of the last decade or two is a near universal support for "dark mode". Otherwise, I would greatly prefer the Windows 2000 "classic" look, or something similar.
> The whole skeuomorphic trend starting in the mid-90s was similarly awful for the same reason. First, it was often hard to tell what was a control and what was just decoration. Second, it often took trial and error to figure out what was what.
I strongly disagree - do you often find it hard to figure out where the light switch is when you enter a room? Terrible applications are terrible regardless of whether they are modern or old, whether skeuomorphic or purely functional. But well written applications tend to have more affordance when skeuomorphic because people already know a lot about real world controls and their function.
I agree with your sentiments, but not your timeline. The mid-'90s was the high point for GUIs, with Windows 95 nailing it pretty much across the board.
And as you note, "flat" design is NO design. It's total dereliction of the design task. Fortunately we're seeing some steps back toward legitimate GUI, where controls are occasionally demarcated as controls.
A great example of Windows's pathetic regression is "dark mode." Since the early '90s (and I mean '91 or '92), you could set up a system-wide color scheme. Inverse color schemes were an unfortunate vestige of the late '80s, early '90s... the advent of the Mac, "desktop publishing," and the effort to make the screen an analog for a piece of paper. That analogy fails.
The result was millions of people reading black text off the surface of a glaring light bulb all day, every day. The first thing I did was set up a charcoal theme in Windows, pretty much exactly what all the "dark" schemes are today. And all properly written applications inherited it and all was good.
So... just in time for people to realize that this was the way, Microsoft REMOVED the color-scheme editor from windows. Only to have to hastily slap a hard-coded "dark mode" back onto the OS. So damned stupid.
The EU has enough sparsely populated areas without much valuable nature that are also far enough south for current-generation solar panels to work out well.
And besides that, most EU countries have enough places where a lot of solar panels could still go without much issue and without replacing fields.
Going as far as North Africa is a bit too far to be convenient for power transport.
Yeah, completely analogous. Physical tools aren't subscription-based and prone to outages. Except when they are, but that's – luckily – still something that people feel strongly negative about.
And if my IDE or compiler (or computer!) stopped working because it requires a connection to the mothership I'd be livid. But I guess the cloud-everything, subscription-everything model has successfully made people accept an objectively worse world.
Reminder that this is the company that decided to replace Paint with something called "Paint 3D", the laggiest and bloatiest "literally nobody wanted this" drawing app I've ever seen.
It was absolutely sold as a replacement. And it's gone now because literally nobody wanted it, used it, or understood why it existed. Sure, you could still find the old Paint in a disused lavatory behind a locked door with a sign "beware of the leopard". It wasn't even installed by default, unlike the 3D version, or do I recall incorrectly? Even MS isn't so stupid as to ship two separate accessories both called "Paint" in the same OS by default!
And a weird obsession with making it impossible to customize the sidebar in Explorer, so there was a “3D Objects” folder stuck there permanently unless you’re the kind of user who doesn’t mind a trip to the registry editor.
What percent of users ever found that useful? I think I’m being generous to guess one in ten thousand.
Absolutely braindead management running Windows development.
For their default file explorer experience, the prominent fourth option right in the sidebar. Oh my gosh, that is hilarious. Did someone think it made the computer look advanced (or did they want you to buy apps to uh make 3D stuff from them)?
Had to basically reinstall my PC every 3 months (if I used VR in those times, which I stopped doing after a few reinstalls) because the Mixed Reality app somehow broke itself again, with no amount of updates, fixing, reinstalling, or terminal work able to repair it.
I tried, I tried a lot, but all the hours were just wasted, since only a clean install worked, for about 2-3 months, until it just decided it didn't want to open again.
The Windows Mixed Reality portal made me stop playing VR completely about 5-6 years ago, because I couldn't justify reinstalling everything every few months for a few hours of Beat Saber, and then 3-4 years ago I FULLY switched to Linux, so now it's just a paperweight anyway (I think they removed the support in modern Windows anyway, IIRC).
Basically just waiting for the Steam Frame each and every day currently.
I remember how Skype, an awesome piece of software, transformed into Lync, which worked fairly well, then slowly transformed into whatever MS wanted to call it year after year, slower and more buggy than the year before.
Lync started out as "Office Communicator 2007" before being renamed Lync. Then Microsoft purchased Skype and rebranded Lync as "Skype for Business" even though it was still a completely unrelated product, with just some basic interoperability slapped on. Skype-the-consumer-app lived on separately as its original product in parallel.
Just another example of how Microsoft is utterly incompetent at branding - always have been and always will be.
Looking back, I understood with Windows XP that I wasn't in the target group. Win 95/98 had a simple but functional file search. Being naive, I was expecting some power user features in the future, like regex search.
Win XP replaced the classic file search with one that had an animated dog in it.
The dog search was completely, utterly useless. You would not find anything with it. It was so bad I still vividly remember my bafflement about it.
Give FlowLauncher[0] or Windows Powertoys Run[1] a shot.
There are some amazing tools like that (and Everything[2], which replaces Windows' inferior search) that really change how one interacts with Windows.
There are other tricks like putting scripts or shortcuts or executables in a directory referenced by your PATH variable, which can make the Win+R trick better too.
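The Win+R box resolves a bare name by walking the directories in PATH, much like a shell does, which is why dropping scripts into a PATH directory works. Python's shutil.which performs the same lookup, so the trick is easy to verify; the directory and tool name below are invented for the demo, and on Windows the script would additionally need an extension listed in PATHEXT (.exe, .bat, .cmd):

```python
# Demonstrate PATH-based command resolution, the mechanism behind
# dropping scripts into a directory on PATH so Win+R (or a shell)
# finds them by bare name.
import os
import shutil
import stat
import tempfile

tools = tempfile.mkdtemp()                  # stand-in for your scripts dir
script = os.path.join(tools, "hello-tool")  # hypothetical tool name
with open(script, "w") as f:
    f.write("#!/bin/sh\necho hi\n")
# Mark it executable so the PATH search will consider it
os.chmod(script, os.stat(script).st_mode | stat.S_IEXEC)

# Prepend our directory to PATH, then resolve the bare name
os.environ["PATH"] = tools + os.pathsep + os.environ.get("PATH", "")
print(shutil.which("hello-tool"))           # resolves to the script's full path
```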
Thanks! Also useful for an old win10 machine I have, and probably shouldn't be using anymore, that no longer responds to clicking the start menu button...
Don't throw it away. Install Windows 8 and the last offline version of Office you can find. It makes for a great distraction-free workstation and a monitor for your Android (scrcpy).
Or you can install and reinstall Linux distros and learn the ropes.
You should be fine as long as you use a proper firewall device and access only manually whitelisted websites, but it is always better to keep it offline. That said, it can become your next firewall device.
I built it circa 2012 or 2013 and still have the physical Win8 disc. I considered futzing with Linux on it. The extent of my Linux experience is via SSH to a Raspberry Pi, kludging some Docker containers for this and that. The SSH/Linux terminal feels like fumbling in a dark room, flipping random switches until something works.
>scrcpy
I also have a pixel 5a whose screen doesn't work, but I think functions otherwise. Would this allow me to interface with it?
I think the Pixel 5a can be connected to a TV through an HDMI cable. If so, plug that and a mouse in to set up adb (toggle enable adb, toggle debug permissions, then accept the adb host).
Back in the 90s doing substring match was probably deemed way too expensive and so just calling the executable name directly was as optimized as it got... and it's beautiful :)
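The cost gap alluded to above is real: resolving an exact name is a single hash (or filesystem) probe, while substring matching has to scan every candidate. A toy comparison, with an invented program table standing in for what the Run box or a modern launcher would index:

```python
# Exact-name launch (the 90s Run box) vs substring search (modern
# launchers): an O(1) dict probe vs an O(n * m) scan over all entries.

programs = {
    "notepad": r"C:\Windows\notepad.exe",
    "calc": r"C:\Windows\System32\calc.exe",
    "mspaint": r"C:\Windows\System32\mspaint.exe",
}

def run_box(name):
    # 90s style: you type the exact executable name, or nothing happens
    return programs.get(name)

def launcher(query):
    # Modern style: substring match over every known program name
    return [path for name, path in programs.items() if query in name]

print(run_box("calc"))   # exact hit
print(run_box("cal"))    # None: no fuzziness at all
print(launcher("pa"))    # matches both notepad and mspaint
```

Real launchers layer fuzzy scoring and an index on top, but the asymptotic difference is the same.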
I think that crown belongs to the pile of pig shit that was the Windows 10 photo viewer. My first experience with that trash fire was opening a simple 2k photo which took 15 seconds and 150 MB of memory on a six core i5 with 16GB RAM. Viewing images was pain and suffering until I gave up and re-enabled the Win 7 viewer which was thankfully still included.
It was supposed to be a third-party replacement, sure, but certainly not an official one. It started as a student project. It's just the prefix that tricks your brain to associate it with MS's own .NET branded applications.
To be fair, the .NET brand is already super convoluted (there's .NET framework, the .NET core, .NET runtime, the .NET desktop runtime, the .NET sdk, and I'm genuinely not even sure which if any of these might refer to the same thing), on top of it weirdly sounding like something internet related to a casual user.
Yes, "Copilot" is not the first brand that MS has tried to stick to everything while being just as confused about it as (inevitably) the consumers. Although somehow they did manage to keep .NET mostly aimed at developers - besides the actual frameworks there's Visual Studio .NET and other dev tools, but I'm actually a bit surprised that they never had "Office .NET" or "Outlook .NET" or even "Windows .NET Edition" or something like that. Maybe they still had some sane people in charge of marketing and brand management back then.
I understand enough for its arguments' symmetry to have an impact. I used Deep Research; it had the paper from the link as input, plus some previous discussions about Tensor Logic and the new hardcoded neuroweb-like processors. It didn't make those up either.
"I just want to get this out of my hands in case I made the model stumble upon something important. Its reasoning seems solid but I'm no expert. Here it is, go crazy:"
Infinite demand, maybe, but not at wages that most people are willing to accept. Of course, if there's literally no other work, then previously-middle-class people will take what's available and become homeless because the wage doesn't pay the bills (which are, in places, extremely inflated due to decades of jaw-droppingly bad housing and transport policies). Sounds like a highly desirable future.
Um, I expressly said that high wages wouldn't stay? If the choices are either being jobless and homeless, or doing some menial cotton-harvesting job while still being homeless, we got a slight social problem. The GP said that there's a lot of demand for menial labor. That demand only exists if you don't have to actually pay for said labor. In other words, it's not demand at all.
This is a study of the feasibility of launching an intergalactic colonization wave (and its implications re: the Fermi paradox), not a proposal that humans should do that (it would be just slightly ahead of its time for that!), or a discussion of the ethics or higher-level utility of doing so. It would be refreshing to see someone discuss the things the paper actually discusses. To use early-2000s terminology, the paper's future shock level is higher than that of most HN readers, leading to rather banal discourse.
In any case, I'm fairly sure the authors agree that sending mindless automata to colonize the universe doesn't seem like a great idea. Nevertheless some alien intelligence (including an Earth-based AGI) might find it a completely reasonable, even imperative, goal.
But sentient machines or uploads (assuming for the sake of this thought experiment that they are possible)? That's a different thing.
But the thing is that the Fermi paradox isn't illuminated by scenarios that are technically possible but extremely unlikely. I get that all it takes is some subset of people who want to do it to make it happen, or as you say, some alien species that decides it's a good idea, but I'd argue that the idea is patently bad, and there's good reason to think that no species would bother -- not 100%, obviously, and the steel man argument would say it only takes a fraction of a percent, but I'm personally unconvinced that anyone would bother.
You two debated this as a philosophical or even moral issue, but it changes everything when you look at it from the natural-hazard perspective. It doesn't even have to be a subjective matter. Picture this: you know climate change is happening, and you understand that some colony of animals will surely vanish because of it if you don't do anything. Doing something could mean just taking a few pairs of animals and relocating them to a safer area. Does the survival of such a descendant colony of animals mean anything (to anyone)? Who can argue that it won't one day be our turn, and our obligation, to reduce the risk of having the only known capable civilization residing on only one planet or galaxy?
For sure, but that's my point; "taking a few pair of animals and relocate them to a safer area" is not what this paper is discussing. A better parallel would be "take digital photos of the endangered animal and circulate them around the internet." The proposed method doesn't spread us -- it spreads teeny tiny machines we made, for no reason at all other than to say we did it. And long after we're gone, when the Sun has died, far away galaxies will be polluted with little machines, each containing a copy of some data about us.
Right, but it’s only a feasibility study. By definition these only study the minimum system that could accomplish the goal, which was to visit as many galaxies as possible. Given the mass budget and density of the data storage contemplated there’s no reason the probes couldn’t carry enough information to create real human colonies in the process of replicating themselves.