I’m working on the concept of a situated media player, whereby embedded media functions as a webring, cross-linking laterally between different sites in an act of solidarity against a centralizing Web 2.0.
Sad to see this. Interesting to see the comments on the post itself. Amazing that old comments will simply be discarded (or am I misunderstanding?).
Forked Version: https://github.com/automatist/thank-you-github
Text copied 15 Jan 2016
Before 2007, the way to participate in Open Source was fragmented. Each project had their own workflow, patches circulated in emails, issues were reported in a myriad ways, and if anyone wanted to contribute they had to figure out every project’s rules.
Free software, since the GNU project started in 1985, and in the many surrounding waves of activity, has been a place for a diversity of practices, programming styles, opinions, time schedules, personalities and tools. And that’s a very good thing. This diversity is the core of Free software’s strength. Social participation is often messy and rightly requires an investment of time and consideration of others.
Then, a handful of people took the challenge to build an awesome platform and as a consequence of their hard work, their platform earned its hegemony.
Handful of people? Right: standing on the backs of the community that produced git (for which I feel truly thankful), on the history of alternative version control systems and web platforms, not to mention all the developers whose contributions to GitHub, in the form of entrusting it to manage their code, have made it as valuable a commodity as it is today.
Hegemony, indeed. Github represents a cultural hegemony in software development today, and as such is actively displacing and distorting the practices of the very community that helped to create it.
Nowadays doing Open Source is infinitely easier thanks to you, GitHub. You’ve provided the tools and the social conventions to make those days a thing of the past. Your impact in the Open Source movement is unprecedented.
Social conventions? Like enabling a culture of hostile forking rather than collaboration? Check out the story of Natacha Porté and libupskirt / libsoldout and give this thoughtful essay from Aymeric Mansoux a browse.
We want to express our gratitude for all you’ve done and do for Open Source.
And please, may you stop doing it.
GitHub, thank you very much.
As Lily Allen would say, GitHub, fuck you very much.
Continuing the spirit of documenting those little recipes I find myself continually drawing upon, here is a makefile for turning a directory of markdown sources into HTML using pandoc, simply by typing make.
# find all .md files in the directory
mdsrc=$(shell ls *.md)

# map *.md => *.html for mdsrc
html_from_md=$(mdsrc:%.md=%.html)

all: $(html_from_md)

# Implicit rule to know how to make .html from .md
%.html: %.md
	pandoc --from markdown \
		--to html \
		--standalone \
		--css styles.css \
		$< -o $@

# special rule for debugging variables
print-%:
	@echo '$*=$($*)'
Note the assumption of the presence of a stylesheet named styles.css.
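As a quick sanity check, the print-% rule lets you inspect the computed variables without pandoc even being installed. A scratch demo (the directory and file names here are invented for the demo):

```shell
# Recreate just the variables and the print-% debugging rule in a scratch dir.
dir=$(mktemp -d)
cd "$dir"
touch a.md b.md
printf 'mdsrc=$(shell ls *.md)\nhtml_from_md=$(mdsrc:%%.md=%%.html)\nprint-%%:\n\t@echo '\''$*=$($*)'\''\n' > Makefile
make print-mdsrc          # prints: mdsrc=a.md b.md
make print-html_from_md   # prints: html_from_md=a.html b.html
```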
I have been using a git-based web publishing workflow for a few projects now, and have finally hooked it up to my personal site (automatist.org). For my own documentation, I thought I’d write down my take on what has proven to be a very useful tutorial: Joe Maller’s “A web focused git workflow”.
Basically, what Maller’s setup explains is how to create git publishing relays, so that pushing changes from one (local) repo gets relayed automatically to another, for instance a live online website. The trick is that you need to use a bare repository that acts as a “hub”, allowing changes to be pushed from any of the satellites.
Turns out, you really do need a bare repository, as these are the connecting elements of git. You can’t (or at least shouldn’t) push to a repo with a working directory (i.e. a git repo that actually has its files “exposed” as a regular directory with files). Bare repositories are instead like inside-out folders, where the tender files are hidden away, and whose main purpose is to be pushed to and pulled from.
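A quick way to see the difference, in a throwaway directory (all names here are invented for the demo):

```shell
# A bare clone contains only the repository internals: no checked-out files.
top=$(mktemp -d)
cd "$top"
git init -q work
git -C work -c user.name=demo -c user.email=demo@example.org \
    commit -q --allow-empty -m "initial commit"
git clone -q --bare work work.git
ls work.git                                     # HEAD, config, objects, refs, ...
git -C work.git rev-parse --is-bare-repository  # prints: true
git -C work rev-parse --is-bare-repository      # prints: false
```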
Slightly differently from Maller’s method, I usually start from an existing repo, which may or may not already have a remote. The steps I usually follow are something like:
cd /path/to/mygitfolder
cd ..
git clone --bare mygitfolder mygitfolder.git
scp -r mygitfolder.git firstname.lastname@example.org:git/
cd mygitfolder
git remote add myserver email@example.com:git/mygitfolder
Then, on the server, I clone from the hub into a “live” working directory that is publicly served by my webserver. Caution: at this point the contents of .git are publicly accessible (which means that all versions and the commit history could be accessed, which is not what I want); see the end for how to fix this.
ssh firstname.lastname@example.org
cd /var/www/
git clone ~/git/mygitfolder .
Finally, I create a post-update hook script in the hub (triggered after things are pushed to it) that automatically steps across into the “live” repo and pulls the new changes from the hub.
cd ~/git/mygitfolder.git/hooks
cp post-update.sample post-update
emacs post-update
and edit this to:
#!/bin/sh
cd /var/www/vhosts/automatist.org/httpdocs || exit
unset GIT_DIR
git pull origin master
# though it seems wrong to me; this must come after the above
exec git update-server-info
Now, when I commit changes and push from the initial repository, I see “remote” messages that show the update propagating to the live repo.
Counting objects: 3, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 303 bytes | 0 bytes/s, done.
Total 3 (delta 2), reused 0 (delta 0)
remote: From /home/myusername/git/mygitfolder
remote:  * branch            master     -> FETCH_HEAD
remote: Updating d54b302..a3f49d3
remote: Fast-forward
remote:  index.html | 2 +-
remote:  1 file changed, 1 insertion(+), 1 deletion(-)
To email@example.com:git/mygitfolder
   d54b302..a3f49d3  master -> master
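For reference, the whole relay can be rehearsed end-to-end in a scratch directory. This is only a sketch: all names are invented, and the hook pulls the tracking branch rather than hard-coding master, so the demo works with any default branch name:

```shell
# End-to-end rehearsal of the hub/live relay in a scratch directory.
top=$(mktemp -d)
cd "$top"

git init -q local
cd local
git config user.name demo
git config user.email demo@example.org
echo "hello" > index.html
git add index.html
git commit -q -m "first version"
cd ..

git clone -q --bare local hub.git      # the bare "hub"
git clone -q "$top/hub.git" live       # the public "live" working copy

# post-update hook: step into the live copy and pull after every push
cat > hub.git/hooks/post-update <<EOF
#!/bin/sh
cd "$top/live" || exit
unset GIT_DIR
git pull -q
exec git update-server-info
EOF
chmod +x hub.git/hooks/post-update

cd local
git remote add hub "$top/hub.git"
echo "changed" > index.html
git commit -qam "edit index"
git push -q hub HEAD
cat "$top/live/index.html"             # prints: changed
```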
To fix the visibility of the .git folder, there are a number of things you can do (as usual); I ended up just changing the permissions of .git:
cd /var/www/
chmod 750 .git
Part of the ongoing transition of big commercial online platforms away from a promiscuous and porous web towards the “gated platform” and “developer sandbox”. Of note is the language of “authentic experiences” and the “refined resources of the API”.
Only a tiny fraction of Instagram feed reading happens in third-party apps, so Instagram is shutting down its feed API to make feature development nimbler and create a more consistent user experience. The move is part of a big cleanup of the Instagram platform. It involves listing exactly what’s allowed with its APIs, an app permission review process that makes Instagram a gated platform, and a new developer testing sandbox.
The changes will hit developers of Instagram clients hardest, especially those for platforms Instagram doesn’t natively support like iPad and desktop. Apps that will have to change by June 1st include Retro, Flow, Padgram, and Pictacular for iPad, and Webbygram, Webstagram, Instagreat, and Itsdagram for desktop. The new policy will also strike down any service offering auto-following, -liking, or -commenting.
In return, Instagram says this will “set up a more sustainable environment built around authentic experiences on the platform” for developers in the categories it does allow. Essentially, crappy apps won’t burn users, detracting from good apps. Plus, Instagram will be able to devote more resources to support the refined set of APIs it does offer.
Using hashes to provide “distributed image hosting”?
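The idea would presumably be content addressing: name an image by the hash of its bytes, so that the same file yields the same identifier wherever it is hosted, and any mirror can serve it. A minimal sketch (the file and its bytes are invented; sha256sum is assumed to be available):

```shell
# Name a file by the hash of its contents: the identifier is the same
# no matter which server hosts the bytes.
f=$(mktemp)
printf 'fake image bytes' > "$f"
h=$(sha256sum "$f" | cut -d' ' -f1)
echo "$h"   # 64 hex characters; could serve as /img/$h on any mirror
```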
After a long gestation period, Writing and Unwriting (Media) Art History: Kurenniemi in 2048 has been published! Edited by Joasia Krysa, and including a contribution written by Geoff Cox, Nicolas Malevé and myself about the Kurenniemi digital archive. (You can view the original working version of the contribution on the AA wiki.)