For whoever is reading this: you should buy "A People's History of Computing in the United States" by Joy Lisi Rankin -> http://www.hup.harvard.edu/catalog.php?isbn=9780674970977 .
I will be updating this channel to reflect who truly was a computing pioneer versus who the commercial narrative branded as one.
"In contrast, as a post-communist subject, I cannot but see Internet as a communal apartment of Stalin era: no privacy, everybody spies on everybody else, always present line for common areas such as the toilet or the kitchen. Or I can think of it as a giant garbage site for the information society, with everybody dumping their used products of intellectual labor and nobody cleaning up. "
From Mark Wigley's 'Network Fever'
All the webs that proliferated in the sixties, including Doxiadis's City of the Future, were likewise an attempt to establish a physical image of the invisible space of electronics, even if electronics itself is not discussed. All the projects by Tange, the Metabolists, Team X, Archigram, Constant, and others were practical and even took their character from their engagement with the pragmatics of construction, but they were first and foremost polemical images, and were presented as such. It matters little that virtually nothing from all those experiments was built. Or, to be more precise, what was carefully built was a set of images that remain polemical today, a commentary on the networks we already inhabit rather than a dream of a future world.
It took decades to forget such experiments so that a new generation could present itself as the first to engage seriously with the architecture of electronics. Much of what we hear today is an echo, but so delayed that it sounds fresh. It is as if the discourse forgets its own history precisely because it is too afraid to leave those earlier positions behind. Supposedly avant-garde visions manifest the discipline's greatest fears.
https://fasterthanli.me/blog/2020/a-half-hour-to-learn-rust/
Been around for a while (since 2013 in Chrome)
Way to directly connect browsers (clients) to each other
SIP—earlier model of doing this—shoved into HTTP and JSON
Inside of WebRTC it's all these weird formats from the 90s
Asymmetric communication: Simple case
Browser 1 will make an offer
Browser 2 gets the offer and writes an answer, sends that back
They use the information in the offer and answer to connect
Candidate IP addresses
IP addresses the browser thinks it will be reachable at, gathered by checking its network interfaces; these include local addresses (the candidates)
try each and find ones that work (are reachable)
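A rough sketch of those two steps in Python: gathering host candidates from the OS, then probing each one until something answers. This is only an illustration of the idea, not ICE itself (real connectivity checks are STUN packets over UDP, and the port number and function names here are invented for the example):

```python
import socket

def local_candidates(port=50000):
    # Gather addresses this machine might be reachable at, roughly the
    # way a browser builds its host candidates: ask the OS which
    # addresses its interfaces have. (50000 is an arbitrary example port.)
    host = socket.gethostname()
    addrs = {info[4][0] for info in socket.getaddrinfo(host, None)}
    addrs.add("127.0.0.1")  # loopback is also a (local-only) candidate
    return [(addr, port) for addr in sorted(addrs)]

def first_reachable(candidates, timeout=0.5):
    # Try each candidate in turn and keep the first one that answers;
    # a crude stand-in for ICE connectivity checks.
    for addr, port in candidates:
        try:
            with socket.create_connection((addr, port), timeout=timeout):
                return (addr, port)
        except OSError:
            continue
    return None
```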
How are the offer and answer transmitted?
Use a shared server that both can reach to relay offers and answers
Usually done with websockets but not required
Only used in establishing the direct connection
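The offer/answer flow above can be simulated in a few lines. This is a minimal sketch of the message choreography only: an in-memory queue stands in for the shared signaling server, and the dicts stand in for real SDP blobs (in a browser this would be RTCPeerConnection plus, typically, a websocket):

```python
import queue

class SignalingChannel:
    # In-memory stand-in for the shared signaling server; in practice
    # this is usually a websocket server both browsers can reach.
    def __init__(self):
        self._queues = {"browser1": queue.Queue(), "browser2": queue.Queue()}

    def send(self, to, message):
        self._queues[to].put(message)

    def recv(self, me):
        return self._queues[me].get()

signaling = SignalingChannel()

# Browser 1 makes an offer (an SDP blob in real WebRTC) and sends it
# through the signaling server.
offer = {"type": "offer", "sdp": "...", "candidates": ["192.168.1.5:50000"]}
signaling.send("browser2", offer)

# Browser 2 receives the offer, writes an answer, and sends it back.
received_offer = signaling.recv("browser2")
answer = {"type": "answer", "sdp": "...", "candidates": ["10.0.0.7:40000"]}
signaling.send("browser1", answer)

# Browser 1 receives the answer; both sides now hold each other's
# candidates and can try to connect directly. From here on, the
# signaling server is no longer involved.
received_answer = signaling.recv("browser1")
print(received_answer["type"])  # → answer
```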
Unreachable network case (NATs)
If both are on separate private networks, no candidates are reachable
Use NAT traversal technique called STUN (part of ICE protocols)
STUN server (separate from signaling server)
Browser reaches out to the endpoint, asks for its (publicly routable) IP address, and figures out how it's NATted (what port it is "actually" on)
A bunch of public ones exist run by large scale third parties (Google, for instance, or anyone running VOIP)
Uses that as part of its candidates
No need for both clients to use the same STUN server (or even for both to use a STUN server, if one side is routable)
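The core of what a STUN server sends back can be sketched at the byte level. This is a hedged illustration of the RFC 5389 wire format, not a full client: it builds a Binding Request header and decodes the XOR-MAPPED-ADDRESS attribute value a server would return (the public address 203.0.113.9:54321 is made up for the example):

```python
import os
import socket
import struct

MAGIC_COOKIE = 0x2112A442  # fixed value from RFC 5389

def stun_binding_request():
    # STUN header: type 0x0001 = Binding Request, zero-length body,
    # the magic cookie, then a random 96-bit transaction ID.
    return struct.pack("!HHI", 0x0001, 0, MAGIC_COOKIE) + os.urandom(12)

def decode_xor_mapped_address(value):
    # XOR-MAPPED-ADDRESS value: reserved byte, family, X-Port, X-Address.
    # Port and address are XORed with the magic cookie so NATs that
    # rewrite anything looking like an IP:port don't mangle them.
    _, family, xport = struct.unpack("!BBH", value[:4])
    port = xport ^ (MAGIC_COOKIE >> 16)
    if family == 0x01:  # IPv4
        (xaddr,) = struct.unpack("!I", value[4:8])
        ip = socket.inet_ntoa(struct.pack("!I", xaddr ^ MAGIC_COOKIE))
        return ip, port
    raise ValueError("only IPv4 handled in this sketch")

# Build the attribute value a server would return for 203.0.113.9:54321,
# then decode it back.
xport = 54321 ^ (MAGIC_COOKIE >> 16)
xaddr = struct.unpack("!I", socket.inet_aton("203.0.113.9"))[0] ^ MAGIC_COOKIE
value = struct.pack("!BBHI", 0, 0x01, xport, xaddr)
print(decode_xor_mapped_address(value))  # → ('203.0.113.9', 54321)
```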
Some NAT configurations might not allow a different machine to ingress on the NATted port number (i.e. the source must match the destination the connection was initiated to)
You can ask for a relay to resolve this (TURN server)
It will return a port number and an address that work as a proxy to connect to you
Google stats: ~8% of connections are strictly NATted and require TURN
Geographically distributed to reduce latency
For small calls (up to 4 or 5 participants), often just do many-to-many connections (each client connected to each other client)
This stops scaling fast (quadratic number of total connections: n(n-1)/2)
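The scaling difference is easy to quantify. A sketch counting connections only (not bandwidth), with a full mesh needing one connection per pair versus one connection per client to a central forwarding server:

```python
def mesh_links(n):
    # Full mesh: every client connects to every other client,
    # so one link per unordered pair.
    return n * (n - 1) // 2

def sfu_links(n):
    # With a central forwarding server, each client keeps a single
    # connection to the server (which carries all the streams).
    return n

# 5 clients: 10 mesh links vs 5 server links.
# 50 clients: 1225 mesh links vs 50 server links.
for n in (2, 5, 10, 50):
    print(n, mesh_links(n), sfu_links(n))
```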
An SFU server sits in the middle: it can take uploads and stream them back out to the other clients
Adds a bit of latency but is usually good enough
“Selective Forwarding Unit”
Could a client act as the "server"?
Seems reasonable, may be a connectivity issue
Daisy chain model, where each peer passes it along to the next?
Clearly a higher-latency model
Can we determine available bandwidth?
WebRTC tooling might not enable it but Skype once worked this way
Got Skype kicked off CERN networks
If it decides to make you a supernode, RIP internet connection, no choice
Hence Skype is no longer working this way
It is artists-as much as museums or the market-who, in their very efforts to escape the institution of art, have driven its expansion. With each attempt to evade the limits of institutional determination, to embrace an outside, to redefine art or reintegrate it into everyday life, to reach "everyday" people and work in the "real" world, we expand our frame and bring more of the world into it. But we never escape it.
Of course, that frame has also been transformed in the process. The question is how? Discussions of that transformation have tended to revolve around oppositions like inside and outside, public and private, elitism and populism. But when these arguments are used to assign political value to substantive conditions, they often fail to account for the underlying distributions of power that are reproduced even as conditions change, and they thus end up serving to legitimate that reproduction. To give the most obvious example, the enormous expansion of museum audiences, celebrated under the banner of populism, has proceeded hand in hand with the continuous rise of entrance fees, excluding more and more lower-income visitors, and the creation of new forms of elite participation with increasingly differentiated hierarchies of membership, viewings, and galas, the exclusivity of which is broadly advertised in fashion magazines and society pages. Far from becoming less elitist, ever-more-popular museums have become vehicles for the mass-marketing of elite tastes and practices that, while perhaps less rarified in terms of the aesthetic competencies they demand, are ever more rarified economically as prices rise. All of which also increases the demand for the products and services of art
The “abstract machine” is Deleuze and Guattari’s term for the sum of all machines—in their terminology, this includes the body, society, language, interpretation: like the rhizome it stands both for the sum and its parts. So the network too is one of these abstract machines: a mainframe, a terminal, a laptop, a wireless LAN, a string of satellites. And us too, living inside the machine, a part of the network.
That notional space.
The interface thus trains the user into a specific performance. When interfacing with any technology, users subtly and often subconsciously modulate their own behaviours based on the response obtained — which gestures are understood, which hashtags gain traction, which photos become promoted. Users learn what is understood and what is ignored with any particular system; they adapt their practices to make them technically legible; and they refine their strategies based on the results obtained. This iterative cycle of reorientation for maximum recognition is what Tarleton Gillespie calls “turning to face these algorithms” (2014: 184). In a world in which interfaces driven by algorithmic logic increasingly mediate our everyday, this skill becomes critical. From facilitating friendships (Facebook) to getting hired (LinkedIn), massaging your credit score (RevolutionCredit) or maintaining your rating (Uber, Airbnb), the ability to sense what an interface “wants,” and adjust accordingly is key.
Galloway stressed (2008: 947) that existing theories of the interface that only understood it as a palimpsest “can only ever reveal that the interface is a reprocessing of something that came before.” As he alludes, the interface is not simply a set of inscriptions written onto a static object, nor just a fossilized configuration of past practices brought together into a particular media form. Rather, the interface is better understood as a generative performance taking place in the present, a performance intersecting with elements outside its original remit: culture and capital, gender and history. In other words, an interface does not just register the conditions of its own production, but also actively reinscribes them back into the world in specific ways: reinforcing a relationship to the commodity, formalizing a feminine-technical understanding, supporting a particular sexual subjectivity. In connecting, bridging and mediating, the interface is simultaneously shaping. If the interface is a fertile nexus, it is one that is both lively and can affect our lives. Alexa poses an important and ongoing question about what forms that nexus should take.
(from previous link)
Much of how games have been theorized is based on the idea of the separation of the game from life. And this separation has to do with the rules of the game and the means of producing and policing the game space. The interface similarly operates according to a separation that produces another state. In a game, one voluntarily assumes the limitations imposed by a set of rules in order to be incorporated within game play. This sense of freedom within already given constraints is very much like the experience of using an interface. Given this, it’s not surprising that the history of interfaces has been intertwined with that of games.