> They have ruined web design. But I should probably write a whole article on that.
I sure am looking forward to that article; from where I'm sitting, they absolutely saved web design.
Before mobile, I absolutely could not get any designer to grok that browser windows were flexible things, and most users were not looking at the web on full-screen browsers on desktop displays. The term "responsive design" didn't even exist until the advent of the smartphone, to the best of my knowledge.
Hell, we were still looking down the barrel of an eternity of MSIE 6 support until iOS Safari became important. (And the YouTube thing, of course.)
So much of modern design is ruined by the existence of touchscreens. People seem to have forgotten that the design of a UI must be guided by the kinds of input devices it will predominantly be used with. Almost no one is designing for mice and keyboards. These days it's mostly touch UIs with some hotkeys and hover cards thrown in.
On laptops, touchscreens are the minority but somewhat common, I think. I'd guess trackpads are still way more common.
On desktops it is mouse and keyboard.
And CSS still seems unable to differentiate between these three - or even just between mobile/tablet and laptop/desktop (though there are cases where a 27" desktop monitor is not the same as a 14" laptop screen) - which is one of the reasons things are so awful. So people use hacks like minimum-width breakpoints, as shown here:
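For reference, a typical instance of the width-breakpoint hack being described, alongside the interaction media features CSS does offer for targeting input capability rather than screen size (selectors and values here are illustrative, not from any particular site):

```css
/* The common hack: guess the device class from viewport width */
@media (min-width: 768px) {
  .sidebar { display: block; }
}

/* Interaction media features target the input device directly */
@media (pointer: coarse) {
  /* likely a touchscreen: enlarge tap targets */
  .nav-link { padding: 1em; }
}
@media (hover: hover) and (pointer: fine) {
  /* likely mouse/trackpad: hover-revealed UI is safe */
  .card:hover .details { display: block; }
}
```

The `pointer` and `hover` features are standardized in Media Queries Level 4, though in practice few sites use them, which is arguably the real complaint here.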
Now it might be w3schools being bad and there may be a better way, but every single site I see out there that uses responsive design (what a name) does that. Assuming they bother with desktop at all.
Not to mention that they assume all people use browsers in a maximized state. I have a huge monitor, so I always keep browsers at around 1000px or less, which often triggers the "oh, you are on a mobile, lemme bloat up everything for you" style.
In my personal project, I simply have two sets of templates and which one I use depends on the user agent. It isn't possible to adapt the same exact markup to two input paradigms this different without compromise.
Once, Mac OS (and OS X) followed the utilitarian design ethos of Braun.
Now you have "material" (Bauhaus) design... done TOTALLY wrong.
The form doesn't follow usability at all. Just a posh design trying to mimic a static format, as if the computer screen were a flat magazine cover or an information panel on a building. Zero usability.
We need the System 7, Be/Haiku, Windows 9x... design back. Now.
Problem is that in many companies designers are given free rein over how the product looks. And another problem I see all the time is that designers often treat the result of their work as an art piece, not as a tool whose purpose is human-machine communication. So, yes, it looks nice if all you do with it is look at it.
Flat design has only one good thing about it: it can be easily done with vector graphics, which is important for modern high-DPI displays. But I'd gladly go back to using a bunch of bitmaps for every button if it meant the UI would become more intuitive.
> We need the System 7, Be/Haiku, Windows 9x... design back. Now.
Skeuomorphism (OS X 10.9, Windows 7) was nice too tbh.
Among what? Are you telling me that you would design a desktop website with touchscreens in mind despite 99.99% visitors using it with a keyboard and mouse?
Most traffic is mobile these days (presumably mostly using touchscreens). There's various sources for this, like: https://www.statista.com/statistics/277125/share-of-website-... . Specific sites will vary, of course. I'd be curious to hear the ratio on HN.
Eh, mouse-driven UI was already bad. At least by now the computing universe is bifurcated between consumption-focused touchscreen UI and proper keyboard UI. I haven't hunted through an endless field of tiny icons in ages.
Just an hour ago I got reminded how user-unfriendly some terminal utilities are.
# ln
ln: missing file operand
Try 'ln --help' for more information.
Yeah what a helpful error message. Yes I know it takes two file paths to link one to the other. But for the life of me I can't remember which order it wants them in.
I wish there was some sort of "parameter hint" for the terminal, like in an IDE. You type `ln` or `find` or `grep` and it pops up above the cursor and tells you what parameters the command expects.
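For what it's worth, the order follows `cp`: target first, link name second. A quick sanity check (file names here are illustrative):

```shell
# ln takes TARGET first, LINK_NAME second - the same order as `cp src dest`
touch original.txt
ln -s original.txt shortcut.txt
readlink shortcut.txt   # prints the target: original.txt
```

As for inline hints, fish's autosuggestions and the community `tldr` client get partway there, but nothing pops up an argument signature above the cursor the way an IDE does.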
Many sites seem designed for two kinds of devices: huge 29" screens, or small 5" screens. If you use a "normal" laptop things often look less than great. It's not uncommon for some fixed header or whatnot to take up a quarter or even a third of the total screen, especially if you like to zoom text a wee bit.
> Many sites seem designed for two kinds of devices: huge 29" screens, or small 5" screens.
I don't know about the 5" screens since I rarely browse the web on my phone, but it's pretty clear to me that most sites are not designed for huge screens.
I have a 4K 32" screen, and I've lost count of the number of sites that will expand to take up all the available width and show absurdly long lines of text. And since I'm using a tiling window manager, the window is basically full-screen.
The problem here is twofold. The first is that CSS still makes it really hard (AFAIK) to do multi-column text flowing (aka newspaper style). The second is, of course, the tiling window manager: I expect a web page to use the entire browser canvas area it's presented with (i.e., not putting 16" margins around a 2"-wide strip of text, which is the other common failure). So I'm fine with the entire web page becoming a single line of text if I give it an 8K-pixel-wide, 40" screen; I will resize the window to something more appropriate if that's its problem.
edit: Well, I would delete this comment because it looks like it's a fixed problem. I guess, not having done CSS for the past 4 years, I'm a bit out of the loop.
Although it's still not 100% reasonable (if I'm understanding it correctly): instead of specifying the column count, there should be a way to specify rough column attributes and have the browser determine the column count and flow the text. You would think `column-count: auto` would do that, but at least on Firefox it just means `column-count: 1`.
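What's being asked for does exist, though under a different property: `column-width` gives the browser a target width and lets it pick the count and flow the text (the selector and values here are illustrative):

```css
/* Browser computes how many ~30em columns fit, reflowing on resize.
   `column-count: auto` alone defers to column-width, and if that is
   also auto you get a single column - matching the behavior observed. */
article {
  column-width: 30em;
  column-gap: 2em;
}
```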
I wasn't even thinking about dynamic multi-column layouts; just put some sort of maximum width on the text. Sure, there's going to be a ton of empty space, but that's less of a hassle. At least the text is easy to read.
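The conventional version of this fix is a capped, centered measure; a minimal sketch (selector and values are illustrative):

```css
/* Cap line length at roughly 70 characters and center the column,
   leaving the empty space at the sides rather than in the text */
main {
  max-width: 70ch;
  margin-left: auto;
  margin-right: auto;
}
```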
Well, that is just as large a problem (if not larger). Where should one set the limit? I despise web pages that are just a long pencil line down the middle of my screen, where 90% of the real estate is wasted, as much as you despise the ones that paint the entire thing in one giant wall of unbroken text. Hence my comment about the solution being multi-column newspaper flow. Your average newspaper probably still fits 50x the information density on a two-page spread compared with what my computer manages across three high-resolution monitors.
Although, part two of this is the really irritating web pages that refuse to vertically scroll more than 50-100 items because it would break their pagination, yielding a case where I'm "shopping" (looking at logs/whatever) and the items fill only the top 1/3 of my monitor. Part three is the applications that scroll but can't accurately reflect where in the wall of text one actually is (think Slack, etc.).
> And since I'm using a tiling window manager, the window is basically full-screen.
Tiling wm works well for me with tall/narrow browser windows utilizing the "phone view". I rarely have full-screen browser window, it's more often 1/4 to 2/3 screen width.
> Before mobile, I absolutely could not get any designer to grok that browser windows were flexible things, and most users were not looking at the web on full-screen browsers on desktop displays.
Except, before mobile almost everyone was looking at the web on full-screen browsers.
Hell, now it's hard to convince people that you don't want to download an app to use a web service.
> we were still looking down the barrel of an eternity of MSIE 6 support until iOS Safari became important.
It was Chrome that really dislodged MSIE. And, frankly, at least MSIE believed in keeping legacy features available. Whatever else the issues were (and they are legion), MS hated bitrot.
Full screen desktops before smartphones were also in the 15"-19" range. The idea of a 27" monitor being close to normal is far more modern than smartphones.
Flash for general websites and for navigation was hellish. But Flash for games? It was a boon. I really enjoyed the early/mid era of Flash games, Kongregate and the like.
I'm not arguing for Flash games now, I think there are much better tools today. But back then it was a boon to indie/experimental game developers (and animation artists) and enabled them to develop relatively cheap games and reach a wide audience.
Fluid design is easy once you abandon a misplaced obsession with pixel-perfect layouts. The example here works on displays from 3" to 30" (and presumably larger):
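The linked example isn't reproduced here, but a fluid layout in that spirit needs little more than relative units and a cap (all selectors and values are illustrative):

```css
/* Scale with the viewport, never overflow, never exceed a readable width */
.page {
  width: 90%;
  max-width: 60rem;
  margin: 0 auto;
  /* type scales smoothly between a floor and a ceiling */
  font-size: clamp(1rem, 0.9rem + 0.5vw, 1.25rem);
}
img {
  max-width: 100%;
  height: auto;
}
```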
When I'm standing on a corner somewhere and I can use GPS, live traffic data, and restaurant reviews, and simultaneously make a call to someone, at that moment I don't think about my beloved PC at home, sorry.
You can always use a burner phone with a fake identity if you want to stick it to the man.
> You can always use a burner phone with a fake identity if you want to stick it to the man.
For the uninformed, living in the U.S., how do I get a burner phone that isn’t associated with my identity? I’ve looked in the past, it’s been a few years admittedly, but it seemed non-trivial.
The USA is probably the only country in the world where you can still buy a cellphone/smartphone with a working number somewhat anonymously. Walk into any convenience store or gas station, buy a cheap phone with a prepaid plan, and pay cash. In most other countries they are required to verify and scan some official ID first.
Better yet, have the teenagers/panhandlers lingering around outside go in and buy it for you. Also, do it in another town than where you normally live. Don't drive there in your own car. Wear a mask and a hat and whatever else. Dodge the cameras.
As you can see, there are many additional steps that can be done to ensure anonymous use of said burner phone.
A burner phone is really only useful one time. Each time it's turned on and used more and more statistical indicators are emitted.
- The location the device was turned on, used, turned off.
- The devices contacted by the burner phone
The kind of graph databases the government uses can narrow down the number of people that could be using any burner phone to a reasonable guess after just the first power-on and contact (call/text). This is possible with metadata alone. If you have the person's voice and message content, it's even easier. As the phone stays on and contacts multiple numbers, the graph database increasingly narrows down the possible user. I would guess that within 3 calls/texts (not considering location data, or God forbid other smartphone data) you would have 99%+ certainty who the user was, if those interactions follow any previously known pattern.
Nonsense ;) You just can't practically and cheaply hide. The various world governments regularly hide things from the other world governments. Crime organizations manage to stay operating, etc. I forget the exact statistic, but the FBI's most wanted list has a 90+% effective capture rate, but that leaves some that manage to hide sufficiently well as individuals. How much of that is luck vs skill is unknown by me at least.
To be fair, I basically agree with you: it probably takes one mistake to leave a digital footprint that's useful to trackers and can get back to you. So it takes a lot of skill/luck to leave no digital traces these days, but it is arguably possible, even for individuals; it's just not remotely easy.
To think it's as easy as using a VPN is ridiculous. It's generally not the technology, it's the relationships and habits that get you tracked. I.e. It doesn't take a genius to see the communication logs and figure out the close relationships. We all regularly communicate with our loved ones and do the same sorts of things day after day.
> there are many additional steps that can be done to ensure anonymous use of said burner phone.
It depends on who you want to be anonymous from. A spouse? Ad tech companies? Local cops? The FBI/Homeland Security? The various intelligence agencies of the world?
The US is far from the only place where this is possible. It's entirely possible to use cash to purchase a used unlocked smartphone plus a prepaid SIM card here in the UK.
The UK actually handed me a SIM at the airport (or did I just pick it up from a counter?), and I used it throughout my stay for a few months. Isn't that the case anymore?
It's probably a courtesy thing that one company or another arranged for you, or some kind of a promotion. I never had this happen to me.
It's generally the case that if you want a prepaid SIM card, you go to pretty much any shop to buy it for a nominal fee (usually just £1, but shops that rip you off may charge up to £5 in some cases). As soon as you buy and activate a top-up voucher, you are good to go. At no point is your identity or a card payment required.
You might be surprised at the number of cameras that are fake, or not actually recording. It largely depends on whether the business has insurance and if the local police actually investigate petty theft.
Convenience stores and gas stations in the US are frequent targets of armed robbery, which is what the security is for, not petty theft. (There may be some that count on camera shells as a deterrent rather than having live cameras, but if so it has nothing to do with whether the police investigate petty theft.)
In Ireland you can buy a SIM card without ID and pay cash. Similarly with mobile phones too -- I would very much doubt the US is the only country where this is possible. Data protection is generally taken much more seriously in Europe than the US.
Not any. In Taiwan, for example, buying a prepaid SIM + service still requires a form of state/national identification, even when purchased from a 7-Eleven.
The US is not the only Western country where you can buy a SIM anonymously, though one could certainly say it would probably be the last Western country to implement such a law.
The EU really tried (and succeeded, for the most part) to pressure European countries into implementing mandatory ID checks when buying a SIM card. Certain countries have withstood this push from Brussels; not surprisingly, more than 50% of those are on the eastern side of the continent [0]. (To give a bit of context: the communists used to do exactly this back in the day, monitoring communications, and it was widely known; that's why many people don't want it, even under the umbrella of "protection from terrorists".)
> ...how do I get a burner phone that isn’t associated with my identity?
What adversary are you worried about? Talking to a divorce lawyer without your spouse knowing? Keeping higher-skill cybercriminals from bypassing your SMS-based 2FA via SIM swap? Outsmarting the NSA? The answer really depends...
Walk into any Walmart and buy a TracFone. It's about as trivial as can be.
This is what I used to do back when I was poor and couldn't afford a cell phone plan. It was pre-smartphone, but I see data included on the prepaid "minutes" now.
Before cell phones I used to buy minutes on calling cards[1] at Walmart in order to call long distance.
As a foreigner who has traveled to the US a couple of times: how do you get a phone number that is associated with your identity in the US? I recall only needing $60 and a compatible phone to put the SIM card into.
I don't think he's saying you shouldn't be able to use maps and call people.
The problem is that we've gotten used to devices which turn us almost exclusively into consumers instead of using computing devices more as Douglas Engelbart envisioned them: human intellect force-multipliers.
Part of the problem with phones is intentional (massive centralization), but some can't be helped (input capabilities are very limited on a phone).
Can I just say "phone calls" are something that dumb phones can do. GPS and live traffic data are, essentially, the same service in that they fill the same need. Restaurant reviews? Who actually needs those except maybe on a road trip? And then it really seems like a GPS-style feature.
It seems like "the phone as GPS+" is the use case you identified.
I will say though, even dumb phones have web browsers, and have for a long time. You can access Yelp over the internet on a candybar from a decade ago.
GPS and live traffic data are both "route me to X". I don't think people really have another use case for either, although in some cases the "route me to X" may include "X is a close gas station" or "best route is to wait 45 minutes before leaving".
Do people really choose new restaurants while out (unless on a road trip)? I can see using Yelp while planning to go somewhere with friends, but once you're actually outside the house? In other words, I don't see a use case for Yelp outside the computer, with the exception of a road trip.
> GPS and live traffic data are both "route me to X". I don't think people really have another use case for either, although in some cases the "route me to X" may include "X is a close gas station" or "best route is to wait 45 minutes before leaving".
None of this is true.
> Do people really choose new restaurants while out (unless on a road trip)?
Yes, very often. Maybe I'm at the pub with my partner and we decide to get dinner somewhere. I'll use reviews to find something. It's a very very common use case.
> Maybe I'm at the pub with my partner and we decide to get dinner somewhere. I'll use reviews to find something.
You go out to a bar before dinner? I suppose some people do. I've always thought of dinner as starting an evening out, or at the very least being part of a plan (e.g., show X followed by dinner at Y).
The people I know who bother to look at Yelp are planners and often choose a restaurant to meet at quite far away because it's special. The people I know who decide things spontaneously don't bother checking Yelp.
Since you tried condescending at the end of your comment: I suggest you try places that smell good from the street and not outsource your thinking to an app. It can be pretty great.
Buy a TomTom? GPS is already built into most cars (although they may be phasing it out if they think everyone can just use their phone). But I know people who have dedicated GPS units (TomTom, Garmin, etc.) in their cars, and they do provide better directions than Google Maps/Apple Maps.
I think the browser on candybars was annoying, but it also was sufficient. Not for daily browsing, but for occasional lookups.
> They are unequal devices. Smartphones are unapologetically devices for consumption. In this regard they differ critically from PCs, because PCs are equal devices in the sense that the same device is used for creation and consumption.
False. I bet the vast majority of content available online is now (think of pics and short videos) created on smartphones.
> They are not real network clients. Smartphones have powerful CPUs and fast network connections, except that you aren't actually allowed to use these resources in any meaningful sense, because doing so consumes battery power, and people don't want the precious battery life of their phones drained unnecessarily.
False (maybe true in 2016, when the article was written).
...
> They have led to massive centralization. Part of the “cloud” movement is probably driven by the fact that while smartphones have substantial computational resources, you can't actually use them because of battery life. So instead the computation is done in the cloud, creating a dependency on a centralized entity.
Maybe true
> They have ruined web design.
Nope; poor "mobile first" (aka let's apply mobile UI patterns to every other device) ruined web design.
If smartphones count as creation machines because they can take pictures and perform some basic editing, then my DSLR or Polaroid camera also counts. Even my pre-smartphone candybar from 2008 would count.
Unfortunately, smartphones, due to their form factor, don't actually allow precise editing or manipulation of content - which is what is usually meant by the idea that "smartphones are devices for consumption, and not creation".
The same applies to text, to some extent. Yes, you can author a novel on a smartphone, but it's gonna be an excruciating process if you want to do any fine text manipulation at all.
You can find full programming environments that run on phones, and I amused myself for some time while waiting for a friend at a coffee shop by writing a simple C program in Vim with Hacker's Keyboard and Termux.
But just because something is technically possible doesn't mean the device is suited for it.
My washing machine or Minecraft redstone blocks may be Turing complete, but realistically they are not the best interface to express what the mind intends to express.
These are mostly used as iPad apps, and even then, they come nowhere near providing the tools that are available on a standard PC. Sure, you can use them to do cool things, but I can also create paintings with just ketchup and fries. That's not what 99% of people who have fries in their hands do or should do.
Are we really going to make the case that Garageband and iMovie allow for "precise editing or manipulation of content"? Last time I used either, they were like watered-down versions of real software that railroaded you through different pre-made templates.
Wanna make music? Press the chord buttons to play a tone on the synthesizer preset that you can't modify!
Wanna edit video? Drag your clips into this timeline with 3 transition effects and a half-dozen compositing themes!
Smartphones count as creation machines because people create content on them. If you take a look at Tiktok, Instagram or many of the other social networks, you'll see countless pieces of content that are created by smartphones each day.
If you got that from that comment, I don't understand your first comment. It's clear to you that smartphones are being used to create content, but you still think they aren't content-creation devices because they lack precision?
If content creation devices aren't the devices that create content, then what are they?
> If content creation devices aren't the devices that create content, then what are they?
I think that's a very good question. It's clear that devices like that are useful for some form of content creation, but I don't know if a term to refer to them has been coined or not.
Intuitively, they don't seem to fit in the category we used to call "content creation", since that nearly always was synonymous with creation+precision, rather than "any kind of creation".
I hate to sound like a nostalgic old fart, but remember when content creation meant making never-before-seen Flash games and animations, or long video content, instead of 5-second jokes and girls dancing half naked?
It's true about content creation. TikTok, YouTube, Instagram, etc. are fed mainly by smartphones. There's far more content created on smartphones than he realizes. Heck, there are movies shot on the iPhone: https://www.cashify.in/12-movies-which-you-never-knew-were-s...
The media is often captured using a smartphone, but I don't think I would call unedited media "content". I reckon that the editing process is more likely to be done on a PC/Mac/Workstation than the phone itself.
OP's point was basically that smartphones do not allow you to "create" at all. I would consider most content, by that larger definition, to be created on a phone (Instagram, Snapchat, Facebook, Twitter, etc.), and even content edited on a PC to be using media created on the phone first.
Only photographers, influencers, etc. have the time or need to do a full pro-camera -> editing-station -> publish cycle for the majority of their content.
Professional stuff, maybe. But for casual content, you can edit on the phone and upload. I'm thinking there's a lot more of that going on than professional productions.
> False. I bet the vast majority of content available online is now (think of pics and short videos) created on smartphones.
The actual main point he was making is that you can't use the device to create things for the device, things that can make use of the device's capabilities. In effect the device is a console. And consoles are purely for consumption. Sure, consoles allow you to make recordings of your gaming sessions, etc., but they can't be used to create the games. And gaming recordings are not content, but a byproduct enabled by the actual content.
>False. I bet the vast majority of content available online is now (think of pics and short videos) created on smartphones.
Even if this isn't true, it's not a great point. Most people didn't use their home PCs to create content anyways, so the ability to create content more or less went to waste.
The worst thing about smartphones (in my case, Android) is how closed they are.
Rooting is getting more difficult, bootloaders are getting harder and harder to unlock, and SafetyNet and the new Play Integrity API are simply user-hostile.
There is absolutely no rational reason to deny root access. Sure, non-technical people could mess something up, but if rooting were just a combination of putting the phone into, e.g., fastboot mode and then running a single command to get root access (factory state: no root access), nobody would accidentally mess up their phone.
> Smartphones are unapologetically devices for consumption
Speak for yourself, my phone is for communicating with friends and recording memories.
> This cultural equality is diminished by an exodus to devices
What the heck is "cultural equality" and what do people using smartphones have to do with it? Does the author think nobody owns a desktop/laptop anymore?
> They are not real network clients.
Neither is your laptop if you want good battery life.
> They have led to massive centralization.
Is that why Discord is popular on mobile, and not on desktop? Wait, no, it's popular regardless of platform.
---
How does this kind of article make it to the front page of HN, seriously?
>Does the author think nobody owns a desktop/laptop anymore?
The number of households where the only compute device is a mobile device is probably higher than you give it credit for. I know it was for me. I'm not including "smart" devices or IoT devices. These same households may not even have an internet connection, so the mobile device is their only means of data consumption.
Presumably, those households without internet connection don't really have the option of a laptop or desktop then, no?
In that case, smartphones seem to be pretty great at giving people the option to participate in the internet. I know hobby animators, artists, and musicians that produce work solely on their smartphone.
>Presumably, those households without internet connection don't really have the option of a laptop or desktop then, no?
While this probably leans toward yes more often than not, it is totally possible to own and use a laptop/desktop without using the internet; see all of computer history before the internet as an example. You can totally animate, "paint", and create music without being online. Again, see all of art history pre-internet.
Sure, perhaps not 100%, but I don't see how this lends any strength to the article's argument that smartphones are making the world less equal due to some perceived content-creation imbalance between the two platforms. How much content do you see created on computers not connected to the internet versus created on a smartphone?
I have been in several post-production facilities that are not connected to the internet on any production machine. They're locally networked to storage and whatnot, but have no internet access at all. These are the types of places that work on feature films and television content where the studios are concerned about leaked footage. I have been involved in getting multiple facilities MPAA certified.
Obviously YouTube creators don't give a shit about securing their content to the level of MPAA facilities, so it will definitely skew toward non-secure, internet-connected creators rather than secured, non-connected ones. But it is not 0%.
That's nice and interesting, but I'm not sure what any of that has to do with the point that the author is making in TFA, nor the comments that I've made.
>>How much content do you see created on computers not connected to the internet versus created on a smartphone?
Me personally, a lot. I even gave anecdotal supporting evidence, and with that, you can assume you've seen some of this content as well. The vast majority of my own personally created content is also created on non-internet connected hardware.
>but I'm not sure what any of that has to do with the point that the author is making in TFA
I find it interesting that people who are against the mainstream technology trends (smartphones, social media, cloud computing, whatever else) of today will always make the argument that the technology available X years ago when they were most comfortable with it was the pinnacle and everything has gone downhill since.
While the author is singing praises of a desktop computer with XMPP and self-hosted websites, back then there was a similar movement of people ranting against all this "modern tech" and saying we should just read the newspaper, use telephones to communicate and go out of our houses more. He probably fondly looks back at his late night IRC sessions and LAN parties while there were people writing "I don't like Computers (1996)" articles.
Talking in absolutes makes for contradictions indeed.
The thing is, in some areas computers today are much better than they used to be, but that doesn't mean everything is better. There are cases where computers are worse.
And yes, there were people years ago who were saying similar things. That doesn't mean they were always wrong; it can also mean they were noticing things that others didn't notice or care about. See, for example, Wirth's plea for lean software in 1995 - software certainly hasn't gotten better in that regard since then.
I'd like to have a good dumbphone. Here are my requirements:
- good keypad (no accidental double presses, clear tactile feedback)
- voice calls, SMS
- 4g wifi hotspot
- power management good enough to not run out of battery when used as a wifi hotspot and plugged in
- good ui for the requirements above (e.g. ability to choose whether wifi is shut down if no connected devices)
(bonus points for being rugged and waterproof)
My previous try, the new version of the Nokia 8110, nailed 2 out of those 5, and as a result I gave up pretty soon. To my knowledge there is no phone on the market at the moment that fills these requirements, and I find that odd.
Yes, I don't understand why there aren't more feature phones with full hardware keyboards. This, combined with the GP's requirements, would likely convince me to shift to a feature phone.
Take a look at the rugged KaiOS devices. They should do what you need.
I have a "Flip IV" right now (AT&T sent me one when I hooked up my 8110 to their network because that doesn't support VoLTE), and other than being waterproof, it meets your requirements nicely. There are some waterproof devices with the same features, though.
If you could link to specific models or vendors you could recommend I'd appreciate it. It's not clear from quick browsing whether or not KaiOS is a specific hardware vendor or an operating system that can be used on other OEM devices.
The first link you find, given your search engine choices, localisations, and/or personalisations, today, may not be the first link I find. Or someone reading this in a week, or six months, or six years.
Please for their sake if not mine, post a specific link.
I have a similar shopping list and very little that satisfies it.
General commentary:
- A laptop or e-ink tablet can run other voice comms (Jitsi, Zoom, Skype, etc.) These are also generally more capable systems and convenient to use when seated or established at some location. The dividing line between present-gen phones and smaller tablets (beginning at 6") is ... slim.
- SMS is itself not secure and highly problematic. As much as I'd like a dumb phone, I'm not sure I'd trust it to even SMS. Voice comms alone, or support for a secure encrypted messaging system. SMS itself seems too limited to allow an encryption extension, though MMS might suffice.
- Splitting the phone and the hotspot functionalities is an option. That's an additional service, but might be preferable.
- For the phone, I'd prefer a monochrome display, preferably e-ink, with a backlight or frontlight for low-light conditions. A hardware keyboard similar to the BlackBerry, Palm Treo, or Palm Centro might suffice.
Otherwise, a B&W candy-bar phone along the lines of the Nokia 3310, or a flip phone like the Ericsson T28 or Motorola Razr would be a welcome sight.
The feature I'd most appreciate in any of these for the present which wasn't available in their initial deployments is strict whitelist-based call acceptance. Any unknown numbers roll directly to voicemail or are rejected entirely.
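The screening rule described above is simple enough to sketch. This is a hypothetical illustration of the policy only; the function name and return values are made up, not any real phone API:

```python
def route_call(caller, whitelist, reject_unknown=False):
    """Strict whitelist-based call acceptance: numbers on the list ring
    through; any unknown number rolls to voicemail or, optionally, is
    rejected outright. (Illustrative sketch, not a real telephony API.)"""
    if caller in whitelist:
        return "ring"
    return "reject" if reject_unknown else "voicemail"
```

The whole point of the feature is that the default for unknown callers is silence, not a ring, which is why the unknown branch never returns "ring".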
Were you ever a fan of the blackberry form-factor? You may enjoy the Unihertz Titan Pocket.
It’s still a smartphone in that it runs a flavor of Android.
But they nailed the tactile keys I remember missing and still miss about phones with physical keyboards.
Not my daily driver, just something I had been following since the kickstarter and ordered one to fiddle about with. I ended up turning it into my “work phone/pagerduty device”. For that particular purpose, it works very well. YMMV
> Yes there are serious problems with smartphones.
And they become more serious over time, as we've moved more and more of society and interaction onto them...
> And smartphones are here to stay.
I don't think this is at all a given, though. If more people start rejecting them in favor of something simpler, with less permissions, that's closer to a legacy communication device, it allows others to consider that, yes, you might be able to get away without one. Or with a less capable one.
> So how do we fix these issues?
Stop using a smartphone in your personal life, and be at least a slightly vocal "that person," because a lot of other people I've talked to don't like smartphones either for a wide range of options - but don't consider there to be any alternatives. So when I can show off a device that still lets me do voice/text, can check email if I care, has basic mapping, and... not an awful lot else, it's an option that most people quite literally didn't know existed.
Legacy (style) devices are missing one killer app category: data-based messengers. Without a fully-featured version of WhatsApp, Facebook Messenger, Line or whatever it is that your friends and family use, you'll be limited to 1:1 chats. Group chats are an app-only feature, and you'll be excluded from those.
An open Linux-based phone that could spawn Android VMs for each such closed messenger would be awesome. It wouldn't solve the underlying problem of having closed protocols to begin with, but nobody would use the open phone if it doesn't do what they need right now.
In many countries you're forced to use closed apps to log in to government services, banks, and whatnot. If those could run in VMs it would also be rather nice. (Some of them actually try to detect whether they're being run on a real phone.)
KaiOS supports WhatsApp. And, at least some of them support group texting somewhat competently, though not the older ones.
But the reality is that I'm fine with a lot of my communication being computer-based now (with a keyboard). I still manage some group texts, but tend to only actually contribute if it's something critical ("Can you make this date work?") and skip a lot of the random BSing in the threads.
If your requirements are "I want everything I can do on a smartphone, but without a smartphone," you end up in an impossibility, so just find the least offensive black mirror you can and go on with it. But if you're willing to sit back and figure out what actually matters from a phone, and what's a nice-to-have, there are plenty of other options out there.
Travel backwards in time. Get the oldest smartphone you can. Then next year, get an even older smartphone, then go back to a flip-phone, then regress into a candy-bar phone, every time a device manufactured earlier and earlier, with fewer features, fewer chips, fewer peripherals (cameras and mics), less bandwidth. Removable battery. Memory just for the numbers of your contacts.
Unfortunately, that doesn't work, because the old cell network radios don't exist anymore.
You can get a modern flip phone or such, that supports things like VoLTE (AT&T is dropping support for all non-VoLTE phones here... soon, if they haven't already), and there are some candybars, but going with "older devices" will soon enough prevent you from connecting to the cell networks at all. 2G is gone in most areas, 3G is going away, etc.
About ".....old cell network radios don't exist anymore...."
I've often wondered if it would be possible to build some kind of proxy?? hardware/software to solve this problem. Maybe some combination of tx/rx modules for old and new networks. Maybe use a SDR for the old tx/rx module as it would only need to tx a short distance to the old phone
You could, but if you're doing that, it would be far easier to just build your own device that does what you want and nothing else. You can put together a ZeroPhone or something for about the same complexity as a network proxy device that might have to do things with group texts to make them look valid to an ancient device.
And then you're carrying an ancient device plus another proxy. At some point, it's easier to just buy a halfway recent flip phone and use it until it dies.
Technically possible, sure. Practical, I don't think nearly as much.
Plus, if you're actually rebroadcasting an old signal over the air, then the FCC gets involved. And that's an entire new can of worms.
I have worked in the smartwatch industry and have a background in mobile operating system development. I also run a security consulting business with over a dozen active client relationships at any time. Between the two I developed the feeling I -must- be connected and reachable at all times even when sleeping or on personal time.
Recently I reached almost every conclusion this author did independently.
Dropped my cell carrier a year ago downgrading my phone to a wifi tablet and ditched even that device as an experiment a few months ago and have no desire to go back now. I am reachable when I choose to be at my desk. No notifications follow me to the bathroom or to the bar table with my friends. I feel like my life is mine again.
I am finally -present- in everything I am doing, and after over a decade I am finally capable of being alone in my own head without frantically searching for a phone to tell me what to think about.
I leave home with no electronics other than sometimes a digital music player and just take note of the world around me. No electronics are allowed in the bedroom at all.
+1000 would recommend to anyone. Smartphones and the need to be -constantly- distracted is an unhealthy addiction for most that have a desktop or laptop computer they can access intentionally when needed.
They do have at least one redeeming quality: smartphones are a cheap, portable gateway to the internet for millions of people worldwide who cannot possibly access it any other way.
Do you think people should be excluded from communicating with each other (and the wider world) because they can't afford a PC, or because their country doesn't have good internet infrastructure?
I can't answer that with a definitive "yes" or "no". Your example is something obviously good, but there are obvious downsides to this equation also.
Lately, I've been thinking the world would be much better off without this easily accessible casino/skinner box/echo chamber/surveillance state. A lot of social cohesion problems really started with widespread access to smartphones, in my opinion.
Not sure if you've noticed, but everyone having a say has eviscerated society and our ability to communicate with one another. We didn't evolve to be part of a global community where everyone knows everything going on and everyone has a say on every conversation going on in the world. The Internet was better when it took a little work to get access to the information you wanted.
Egalitarian cries of "Internet for everyone!" have devolved into bullshit that ruthless companies looking for more profits have co-opted. Again, it's not like it would have worked out anyway.
> For example, a disproportionate number of IM (XMPP, etc.) clients for, say, Android, appear to rely on a central server operated by the software maker, with some proprietary protocol between the client and that server, rather than simply implementing the protocol directly.
Of note is that old Nokia devices used to allow this (i.e. arbitrary TCP connections including long-lived instant messaging ones) and they reached longer battery lives than any Android device I ever had.
They traditionally claim that you just can't have multiple long-lived TCP connections, because the keepalives keep the radio powered on constantly and the reconnect traffic (due to e.g. roaming) is significant. However, both things are fixed by coalescing all keepalives. This probably doesn't scale to thousands of connections, but it definitely scales to a couple dozen.
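The coalescing idea can be sketched as a toy scheduler: instead of waking the radio once per connection, group keepalive deadlines that fall close together into a single wakeup. This is an illustrative model of the concept, not how any particular OS actually implements it:

```python
def coalesce_wakeups(deadlines, slack):
    """Group per-connection keepalive deadlines into shared radio wakeups.

    Each wakeup fires at the earliest deadline of its group and serves
    every deadline within `slack` seconds of it, so probes go out slightly
    early but the radio powers up once per group instead of once per
    connection. (Toy model for illustration only.)"""
    wakeups = []
    for t in sorted(deadlines):
        # Start a new wakeup only if this deadline can't piggyback
        # on the most recent one.
        if not wakeups or t - wakeups[-1] > slack:
            wakeups.append(t)
    return wakeups
```

With a dozen connections whose timers all fire within a minute of each other and a 60-second slack, the radio wakes once rather than twelve times, which is why a handful of long-lived IM connections need not be a battery disaster.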
In 2016 this claim may have made more sense, but today in 2022 several smartphones get a full /64 IPv6 prefix, while your laptop sits under however many layers of CGNAT it takes your ISP to give you connectivity by putting in the least effort.
I believe there are even online banking applications which reserve the right, in their terms, to detect if a device is “rooted” and refuse to operate on them.
This drives me nuts. I want to improve my security posture and user experience by owning and controlling my own phone and breaking its link to the vendor, removing shitty bloatware to reduce the attack surface, etc.
I'm happy to take responsibility for applying my own updates / patches. Samsung and Google don't have a monopoly on being competent to secure a device, and as the device ages they do an increasingly shittier job of that than other options out there.
I tried to put unrooted LineageOS on my phone, but the banking app closed on start. A lot of people in the Play Store are complaining, but every answer from the developer is: "We updated it, try again."
Phones used to have tactile buttons you could dial without even looking. They also worked in extreme cold. Touchscreen? Below a certain temperature, the 'buttons' cease to exist.
I think it would be cool if smartphones had a paradigm of user-controlled back-end server. All the push notification registration and compute-intensive or network-intensive operations could run on a home server that a phone is paired to. I already choose apps with an option for a self-hosted backend where possible, but its not the default. And I can't run my own Firebase Cloud Messaging server or my own APNs at all.
A lot of smartphone users have IoT devices plugged in at home that, for better or for worse, have a persistent connection to the internet. It would be feasible and useful for these devices to act as personal servers for certain kinds of services. In fact I believe Apple already does this with HomeKit, where users’ AppleTVs or HomePods can act as a “hub” to facilitate communication with devices at home from outside the local network.
I just got a Google Pixel, after a few years on Samsung devices. I've rooted a few times in the past and just went through the process again. When I have a rooted smartphone I really feel like I have a mini computer in my pocket, otherwise it feels like I have this fancy content consumption and notification tool.
I've gone for many month stretches with no phone, and learned to live the old fashioned way again. Arrange meetings, look up directions before leaving, etc. This tended to give me problems in my marriage though as I wasn't available to be contacted while I was out. I was considering a "dumb phone", but I enjoy using slack and having my work calendar available to me, otherwise I will forget meetings.
There's another reason not to like smartphones I saw here a while back: they are surprisingly difficult to use on a fundamental level. A touchscreen is much clunkier than a mouse and cursor. The small screen makes it hard to browse the web, and the tiny touch areas mean it's very easy to press the wrong button by mistake. And of course, a smooth glass screen gives no tactile feedback. The actual device is just a smooth slab of glass and metal, and is quite awkward to handle. These are fundamental things that smartphones are very bad at.
I do find it a bit ironic that this person says, "They have ruined web design"... have you seen your site? Anyway... other than that... pretty interesting article.
Why do we need responsive mobile web design anymore? Honestly, I think it's an unnecessary complication that keeps most of the website content out of view, nullifies zooming, and slows the development of any site. Most phones have great screens, the user can zoom in and out, and landscape mode is as big as a laptop screen; it's probably a better experience. Scrolling vertically only is so... unidimensional.
I also don't like smartphones but I accept reality. So while it won't solve 90% of the article's complaints, I desperately hope we will move out of a time when we have this item that takes us out of live experience and instead, we start building out technology that helps us enjoy our live experiences more fully, especially in regards to communication and attention.
I find it completely hilarious that I can't use my smartphone to do some programming, even a few lines of python.
It's kinda true that unless an app can make money, a quality app has almost zero chance of existing, while that's really not true on desktop. The APIs change too often, there's too much planned obsolescence, etc.
I like my smartphone but I agree with this article.
> For privacy/security use a non-Android Linux-based phone.
I agree entirely, mostly because there's not a huge difference between that and simply not carrying a phone... or your battery is dead unless you carry a bag full of spares.
The state of non-Android Linux phones (PinePhone, Librem 5, etc) leaves an awful lot to be desired, and I'll argue that for most of the stuff that actually works on those devices right now, you're better off carrying a random flip phone that works on modern networks, with week+ battery life. They'll do about the same things, but the flip phone is far cheaper and more likely to actually send/receive phone calls if you need it to.
I'm glad the non-Android Linux devices exist, I have a PinePhone I'm messing with, but they're not really daily driver grade yet, without quite a bit of pain.
Legacy mobile phones are becoming flaky on modern networks. I had to go all the way up to a phone manufactured circa 2009 to get acceptable call quality and a connection without dropouts.
> For privacy/security use a non-Android Linux-based phone.
The problem with this is that a lot of services don't make apps for this (banks, gov't services, messenger services, etc.) and that the value of such a phone is hugely reduced.
It's basically Microsoft Windows anno ~2000 all over again, except now we can "choose" between Android and iOS (marginally better, I suppose).
Oh, the difference between tech nerds and users. No. Users don't modify their phones (or their computers.) Users don't need Windows or some new phone OS so they can do obscure things only 1 in a 10,000 people care about. Industries don't serve tiny minorities of tech savvy users. Most people just want and need a cheap Mac / iPhone. Too bad that doesn't exist.
Smartphones are extremely useful tools we have in our pockets. The problem is when we cross the line and instead of tools in our benefit, we become the tools at the hands of the technology. Self control is something that not many of us have or something that many cave in easily at the hands of abusive corporations.
Much of this also applies to tablets, and the comments here apply largely equally:
- Data is siloed into apps. If it’s stored locally at all its within app-specific databases. In many cases it’s principally stored on cloud.
- Applications cannot interoperate / share data. At best they share URLs or clipboard content.
- The filesystem is not universally user-accessible.
- Application process management means that any app can be closed at any time. Any unsaved state, and there’s frequently a lot, may disappear. Even within apps, content is unstable. E.g., if I am composing a longer comment or post and am researching it in other browser tabs, odds are high the composition itself will be wiped. This is the largest single issue I have.
- Linux userland tools are limited. For me this is key to productivity. Termux is very, very, very good for what it does. What it does is still limited, and not available on iOS.
- Simple basics such as keyboard-based navigation are missing or broken (see Mozilla Firefox/Android, a.k.a. Fennec).
Keyboard implementation in applications themselves is at best inconsistent. This is functionality as old as interactive computing (1960s and before), and it is failing. In Pocket (an article archiver), I cannot use the backspace key when writing tags; there is simply no response. In PocketBook (an ebook reader), I cannot enter a space when composing a search; instead the document scrolls. There are other similar issues.
- Surveillance and monitoring are pervasive throughout the OS and applications ecosystem.
- At the OS and application level, all design intent is focused on consumption and surveillance rather than my own productivity and creation.
- Hardware support by OS vendors is limited to a few years for most owners. Google “commits” to three years, which is mostly aspirational as it may be a matter of months, if not no OS updates at all following purchase. Apple does somewhat better here.
- The touch-focused interface is imprecise and frustrating, lacking the multi-mode functionality and locational precision of a mouse-based interface. Or the blunt effectiveness of text.
In short: I can’t control my own data, I can’t rely on basic input functionality, tools are insanely crippled, decades-old UI/UX conventions are flouted, user state is at peril of being lost at any time, power-tool availability is limited, and surveillance is pervasive.
Tablets are a shit productivity environment. Full stop.
FWIW: I’m writing this on a tablet. I know its limitations well.
In another post I make the case against tablets noting that there is virtually nothing they do which isn't better performed by either a far-more-capable laptop (even a cheap and minimal one) or a dedicated device: a dedicated (and preferably dumb) phone, a still or video camera (now largely the same thing), an audio recorder. Yes, it's a backlash against "convergence", but the gains and wins are huge.
> Smartphones are unapologetically devices for consumption. In this regard they differ critically from PCs, because PCs are equal devices in the sense that the same device is used for creation and consumption. This means that anyone with a PC can create as well as consume, if they so wish. This cultural equality is diminished by an exodus to devices which can only really be used for consumption.
So smartphone companies put so much emphasis on the camera and, to a lesser extent, other built-in input devices, because...?
> Smartphones have powerful CPUs and fast network connections, except that you aren't actually allowed to use these resources in any meaningful sense, because doing so consumes battery power, and people don't want the precious battery life of their phones drained unnecessarily.
True, and this is a big hurdle for ~every attempt to gain traction with modern peer-to-peer protocols. Approximately no-one wants their smartphone battery to die in four hours flat so they can help serve decentralized YouTube on IPFS.
> Part of the “cloud” movement is probably driven by the fact that while smartphones have substantial computational resources, you can't actually use them because of battery life. So instead the computation is done in the cloud, creating a dependency on a centralized entity.
They didn't replace PCs, though. "Real" computation (by these standards) availability has continued to grow alongside smartphones. I'm skeptical how many more PCs would have been sold in a world without smartphones, and exactly how much more "creation" would be going on (though nb. I think "smartphones are just consumption devices" is a totally bizarre take disconnected from reality in the first place)
> They have ruined web design.
Heh. Yes, kinda, but then again several of the best versions of sites (that have multiple versions) are the mobile one. I'm pretty sure web design ruined web design, not smart phones.
> They are devices of unclear alignment, or of clear malevolence.
Pretty damning that the most popular and widely used computing platform we could come up with, falls under this category. If only non-mobile operating systems weren't almost all completely terrible to use, for normal people.
> With a PC, I don't have to perform some arcane operation to actually have control of the device.
Do I need to link to a billion search results dispensing the advice, "you're going to want to disable [MS virus scan / selinux / auto-updates / telemetry / et c] using [the command line / the "about" settings page / et c.] or it'll just keep [getting in your way / breaking / misbehaving]", or can I assume everyone's familiar with that?
Smartphone UIs are just too slow. Ever used a PS Vita? Incredibly fast UI. Once you experience that, it's kind of shocking how bad smartphones are. Texting on smartphones is a horrible experience that hasn't improved since 2008.
I suspect there's a technical reason for that, rather than pure incompetence.
The PS Vita likely devotes most of its power to its CPU and GPU hardware. Its screen resolution is also relatively low, so it doesn't take very many cycles to render.
A smartphone has a tighter power budget. For one thing, the battery is significantly smaller to keep the phone thin. It has to power the cell modem constantly. The display resolution is significantly higher, so there's one or two orders of magnitude more pixels.
The PS Vita menu is probably getting re-rendered at 60Hz in a loop that's also sampling for input events. The menu is just another "game".
The smartphone menu is probably getting indirectly rendered through an abstraction that allows the framebuffer to only minimally be updated as display elements change. Events are likely coming in asynchronously rather than being polled. The rendered image looks the same, but there was significantly more code in the path vs. the game console.
You could eliminate all that complex code and make a phone UI feel snappy like a game, but the power consumption would shoot up and drastically reduce on-time for the same battery size.
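The tradeoff described above can be put in rough numbers with a toy model: a console-style loop repaints at the full frame rate no matter what, while a damage-driven UI repaints only when an input event or animation invalidates the screen. The function and its figures are illustrative, not measurements of any real device:

```python
def repaints_per_second(invalidations_hz, fps=60, redraw_every_frame=False):
    """Toy model of UI repaint work.

    A game-console-style loop redraws unconditionally on every vsync,
    so its repaint rate equals the frame rate. A damage-driven UI only
    redraws on frames where something changed, capped at one repaint
    per display refresh. (Illustrative sketch, not a benchmark.)"""
    if redraw_every_frame:
        return fps
    return min(invalidations_hz, fps)
```

An idle home screen with three invalidations per second repaints three times a second under the damage-driven scheme versus sixty under the unconditional loop, which is the power saving the extra code complexity buys.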
The technical reason is obvious. App development occurs on top of a massively bloated and inefficient UI toolkit where you can build most of your UI in XML. I would be absolutely shocked if the PS Vita used anything like that. They're using OpenGL or similar; coding the UI that way requires more expertise, but the performance is going to be great. You could do the same on a smartphone, of course, and get maybe even better performance than a Vita, but no one does (unless it's a game).
Or the opposite. Fast and battery saving UI snappiness, but atrocious I/O on background tasks up to the point of getting your messages and notifications delayed.
I don't know why you're being downvoted. This entirely matches up with my experience of smartphones. If you are running a non-top of the line Android device, random slowdowns are not unusual.