---
title: olia lialina's turing complete user
layout: page
tags: [books, internet]
toplevel: false
permalink: /books/olia-lialina/turing-complete-user/
---

These summaries are going to be very, very bad. This is sort of intentional. If they spark thoughts, the credit goes to Lialina. If they seem awkward or wrong, please presume that any error is mine – and go and read her writing. She cites a lot of people and historical examples and schools of thought, and all of that matters if you want to dig deeper – but I am mostly sanding it off in my notes, so you will want to get the real thing. There are whole chunks I'll omit that might, to you, be the best part.

(My plan here is to write my stilted summaries and then come back and add a million footnotes with my thoughts)

## turing complete user (2012)

People want interfaces to be invisible – computers should be invisible. There aren't "users" anymore, there are only "people". There aren't "interfaces", there are "experiences". This is problematic less because it makes the computer invisible than because it makes the social situation of the user invisible; if you're a "person", you're not recognizing that you're a "user" rather than a "developer". People also have ideas now about what "user" connotes that don't match up with its history. Being a "user" meant that you were pursuing computing instrumentally toward some other goal, and that the computer itself should just get out of the way; you're busy with something else, and you want to think as little as possible about the computer you're using. Engaging with the computer isn't seen as "creative". The user wants to delegate figuring out the border between the creative and the mechanical, and they delegate it to developers and app designers.

Computers aren't like swiss army knives; an experience of a limited, locked-down computer doesn't represent a few functionalities glued together – it represents someone working very hard to stop you from accessing the fundamental flexibility of the computer. (Cory Doctorow, needless to say.)

Let's not think about the general-purpose computer, but instead flip it around to be the general-purpose user. Users who are going to get what they want done regardless of what an application or device was meant for – they're not locked onto the rails of the designers' choices. Software can think more about that kind of user; that user has a mindset that will fill gaps and leverage ambiguity. We shouldn't say that there's a dichotomy between users and people who understand computers – we should understand and respect users as users.

## rich user experience, ux, and the desktopization of war (2014)

Tim O'Reilly's Web 2.0 tried to make exciting the vision that asynchronous JavaScript requests running in the browser were going to deliver a rich experience without the user having to enrich it themselves by navigating and putting together bits. It was going to offer a lobotomized ability to create on the web without having to put together HTML. No one designed a smooth experience around having people put a little peeing-guy graphic next to some logo representing something they hated, as GeoCities webmasters did. Neocities, tildes, and superglue point to – if not a homepage renaissance – certainly lively homepage energy.

People say "experience", and once you're thinking of "experience" and not of interfacing with a system, you lose sight of how you can direct or customize the system rather than just drift through an experience. There are a lot of ways in which the care taken around "experiences" rather than "interfaces" gets real manipulative. When you make things perfectly seamless, you end up with baroque elaborations once modification comes into play – an NES controller only being able to control an iPad through robotic finger-likes that touch the iPad screen. It widens the gap between the user and the personal computer, between the user and what's really going on.
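(A footnote of my own, not from the essay: to make "asynchronous JavaScript requests" concrete, the Web 2.0 move was roughly this – script fetches a fragment in the background and rewrites part of the page, so the user never navigates or assembles anything themselves. A minimal sketch; the endpoint and element id are invented.)

```typescript
// Hypothetical sketch of the "rich experience" pattern. Instead of the user
// following a link and putting pages together, the page quietly fetches data
// and rewrites itself. The URL and element id here are made up.
async function refreshFeed(): Promise<void> {
  const response = await fetch("/api/feed"); // the asynchronous request
  const items: { title: string }[] = await response.json();

  const feed = document.getElementById("feed");
  if (feed === null) return;

  // The user never navigates anywhere; the "experience" is assembled for them.
  feed.innerHTML = items.map((item) => `<li>${item.title}</li>`).join("");
}

refreshFeed();
```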

They're proposing masking what drone operators see so they don't end up traumatized. Google masks military bases in satellite imagery – a neat, prearranged filter that doesn't raise pesky questions about morality; one-click operation.

We shouldn't go down the path of creating neat desktop-like metaphors and interfaces for war. We shouldn't abstract away what's going on.

## not art not tech (2015)

People are trying to substitute "technology" for "computer", "experience" for "interface", and "people" for "users", and we shouldn't let them (especially not the companies). Many technologies aren't programmable, but computer technologies are – and importantly, they're reprogrammable. By letting the "computer" fade away into the larger category of "technology", we are acquiescing to a world where a company determines what code your computer (including your phone-computer) runs. Computers need to be visible.

There's a lot of important social context about art-and-technology folks' use of that language, and the same for "new media".

We don't recognize how official Apple, Google, and Facebook have become; users appeal to them as they would to governments, for features and for fairness. Which tragedies get special emoji? Which disasters get check-in features?

"Take time to formulate questions that cannot be answered by monopolies or by observing those monopolies."

## once again, the doorknob (2018)

Don't think that interface design comes down to laws or nature or objective reasoning – it's people's decisions, all the way down, conscious and unconscious.

Don Norman and Macintosh thinking are responsible for the use of "transparent" to mean "invisible" and "simple".

One counterexample to all the insistence that users want interfaces to be invisible and don't want to think about them: the early web, and the gloriously crunchy, textured, vivid, personality-full interfaces users created on their personal websites.

You can't treat computers as if they were like other mechanical devices and unthinkingly apply the goal of a doorknob to them. A doorknob should get out of the way so you can just "go through a door", not "use a doorknob". Computers are different. They weren't made to make things simple; they were made to allow you to construct new complications. They should help raise questions, not just provide answers.

Even doorknobs have a lot in their design worth thinking about. They're designed around human physical agency expressed through a hand; go to an uber-secure automatic door meant to be opened only by a System, and you'll understand the difference.

Don Norman, who we are going after a lot here, popularized a misunderstanding of what "affordances" are. There are implications to the ways in which this was gotten wrong; he's apologized, but that doesn't make it uninteresting.

"User experience" is a term that's wildly indefinite. It lends itself to scripted - would we say "polished"? - scenarios. Don doesn't want users having to figure out what they want from general purpose tools.

General-purpose vs. designed and pre-scripted: a dating app is weird as hell because it's basically a personal site, a messaging system, and some access controls. Should that require special software?

Back in the day, Apple was very thoughtful about the importance of "forgiveness", letting the user take things back. This has changed. Undo ought to be a constitutional right. But of course, if you have single-purpose applications with one button per screen, you're not doing anything where you'd need to undo anyway.
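(Another aside of mine, not Lialina's: in code terms, "forgiveness" is roughly a history of reversible actions – every change remembers how to take itself back. A minimal sketch under invented names, just to make the idea concrete.)

```typescript
// Minimal sketch of "forgiveness": each action records how to reverse itself,
// so the user can always take the last thing back. All names are invented.
interface Action {
  apply(): void;
  revert(): void;
}

class UndoHistory {
  private done: Action[] = [];

  perform(action: Action): void {
    action.apply();
    this.done.push(action);
  }

  undo(): void {
    const last = this.done.pop();
    if (last !== undefined) last.revert();
  }
}

// Usage: rename something, then change your mind.
let title = "my homepage";
const history = new UndoHistory();
history.perform({
  apply: () => { title = "me"; },
  revert: () => { title = "my homepage"; },
});
history.undo(); // title is "my homepage" again
```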

Robots are interesting, and they're raising some new and some old questions. People are asking the right kind of questions about robots that they should ask about a lot of interfaces. An interesting affordance on robots is their very anthropomorphism; a robot has "eyes" not because it needs them to see, but because it cues us that seeing is one of the things that particular robot can do.

What is the undo of human–robot interaction?

## from my to me (2021)

"Personal web pages are the conceptual and structural core of the WWW."

The moments at which they've been a thing, at which users have been in power – they were never "a time". There wasn't a "Web 1.0" at the time; it only exists retrospectively. The important people never respected amateur webpages.

"Don't see making your own web page as a nostalgia, don't participate in creating the netstalgia trend. What you make is a statement, an act of emancipation. You make it to continue a 25-year-old tradition of liberation."

No one really liked that amateurs were taping together all kinds of nonsense. Everyone's neglected what they contributed to the development of the web.

"From time to time [they] mentioned artists and web artists as exceptions to the rules they established, but not web vernacular."

There's a straight line between "the rhetoric of alienation that design experts practised in 1996" and the paternalism of the tech companies now.

Maybe web designers could be "showing gnomes the way out of corporations' front yard, if I may steal Tim Berners-Lee's metaphor."

People thought that linking to others was a noble thing to do, that being connective tissue in the network was a role of its own.

Some people don't see the one link IG lets you have as a restriction – they don't even know that they could have that much. Others are looking for the experience that's all funneled into one shape, and they like that there aren't links. Users don't think that links are their job any more. Companies have made that happen: a lot of the time, you can point to another entity within the walled garden, but not another server...

She remembers WordPress as an abomination that filled the web with zombie links, but some point to it as a tool of freedom. So too do people remember the freedoms of Myspace with nostalgia, yet Myspace "took HTML as a source code away from people". And yet, compared to what there is today, you can see why people felt like coders. So will the next thing be even more locked down than Instagram?

People were building things that they controlled, things that pertained to them. "My", not "me." But it's a sort of subversive idea, that making something means that there's now this thing that belongs to you. It's not the way that the tech interests would like you to think.

When Yahoo bought GeoCities, they put in templates and tried to cue people to build in a Me form. You can see how "About Me" went from something marginal to the top thing. Later, Facebook shifted everything to the timeline of your Me.

"What can be done? How to reclaim My?

Don't collaborate! Don't post your texts where you are not allowed to turn them into hypertext.

Don't post your pictures where you can't link them to whatever you like. Don't use content management systems that turn your GIFs into JPEGs. Don't use hashtags, don't accept algorithmic timelines. In short, make a web page and link to others who still have one."

"I think that leaving the platforms and meeting somewhere else is not enough, or not even the biggest deal. The challenge is to get away from Me, from the idea that you are the centre of your online presence. Don't take this imposed, artificial role into the new environments. It will poison and corrupt the best of initiatives."

## user rights (2013)

userrights.contemporary-home-computing.org

There are a lot. I'm not going to write up most of them.

### Undo

### Securely delete my history

It should be available in a clear text format, not in a database that needs extra knowledge.

### Ignore updates

Newer doesn't mean better, and better isn't better for everyone.

### See the URL things are coming from

Apps shouldn't be able to hide that they're browsers!

### Own data

"It's the #1 demand of the User Data Manifesto by ownCloud founder Frank Karlitschek http://userdatamanifesto.org 'The data that someone directly or indirectly creates belongs to the person who created it.'"

There are concerns about this and the comments here articulate some.

### A real keyboard (physical keys)

Someone points out well that this isn't exactly the important root thing; the root right is to manipulate a system using tools with functionality rivaling that of the tools used by the system's developers. (Apple devs aren't coding on iPads.)