I have gradually begun to take control of my own data. Having always been interested in Open Education, and in extending that ethos to controlling my own data, I have made a number of changes over the years to how I manage my data & the technologies I rely on.

“Students – all of us really – should work to build and adopt technologies that we control for ourselves.” – Audrey Watters.

This is what I had achieved so far:

  1. Moved my blog, websites & email to Reclaim Hosting. This hosting platform was started by Jim Groom & Tim Owens and was connected with the fantastic work undertaken at the University of Mary Washington and the Domain of One’s Own project.
  2. Taken ownership of my Photo/Video storage and management. I am using a third party software service called Picturelife, but I have it connected to my own Amazon S3 storage. Picturelife also “sucks” any images I put on Facebook or Twitter etc onto my S3 storage too. (Not sure how long I can keep doing this as Picturelife was acquired by StreamNation, but at least I know I won’t lose the data.)
  3. Set up my own secure cloud server (hosted by Reclaim) using ownCloud, with sync clients on my laptop & mobile devices.
  4. Set up automatic backups of my laptop and desktop machine to my own Amazon S3 & Glacier storage using the excellent (and highly recommended) Arq Backup.
  5. Managed my own media storage and playback (films, videos & music) through Plex.

But yesterday I took a step closer to owning all of my data. I am a regular (not prolific) user of Twitter, and despite some criticisms of it I actually really do find it a great platform to connect and share. However, it has always frustrated me that when I post images & media via Twitter, they end up hosted on Twitter’s servers. I have also noted that Twitter reduces the resolution of uploaded images (I presume to limit data storage and improve the mobile experience).

So I set out to explore the options to control my own media content that I shared through Twitter and this is what I came up with:

  1. You need to use a Twitter client that allows custom URLs. I used Tweetbot for iOS & Mac.
  2. I needed to find some code that I could use on my own server to get the custom URL feature to work. I am not a good enough coder to work it out for myself, but I can navigate my way round GitHub & I found this: – a set of code that sits on your own web hosting server and implements the protocol Twitter clients expect.
  3. I also found some tutorial info here:
  4. The problem I had was that I didn’t want to set up a new domain for this to work and it took me a while to figure out how best to facilitate this through my own domain. In the end the easiest (and most effective) option was to set up a sub domain.
  5. Once I had the domain set up I then used an ftp client to upload all of the resources that I downloaded from GitHub to the new sub domain folder on my web server.
  6. Once I had done this I could then set my custom URL in my Twitter client e.g.
  7. Now when I post images via Twitter they are hosted on my own domain’s servers and not on Twitter’s. This has also allowed me to share higher resolution images and longer video clips:
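Under the hood, the server-side piece is essentially a tiny upload endpoint: the Twitter client POSTs the image to your custom URL, the script saves it on your hosting, and the client gets back the public URL to embed in the tweet. This is only a rough sketch of that core logic, not the actual GitHub code I used — the function name, the directory layout, and the `base_url` parameter are all illustrative assumptions:

```python
import hashlib
import os

def store_media(data: bytes, ext: str, media_dir: str, base_url: str) -> str:
    """Save an uploaded image under a unique name and return its public URL.

    data      -- raw bytes of the uploaded file
    ext       -- file extension, e.g. "jpg"
    media_dir -- folder on the server that the subdomain points at
    base_url  -- public URL of that subdomain, e.g. "https://img.example.com"
    """
    # Derive a short, collision-resistant filename from the file contents.
    name = hashlib.sha1(data).hexdigest()[:12] + "." + ext
    os.makedirs(media_dir, exist_ok=True)
    with open(os.path.join(media_dir, name), "wb") as f:
        f.write(data)
    # The client embeds this URL in the tweet instead of a pic.twitter.com link.
    return base_url.rstrip("/") + "/" + name
```

Because the file lives on your own server, nothing is recompressed along the way, which is why higher-resolution images and longer clips survive intact.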

I am pretty pleased with this. I do not consider myself to be a web developer at any level, but I am inquisitive and a bit tenacious, and I think those two things combined help me navigate my way through the more technical aspects of managing my own data.

My next little project will be to set up my own URL shortener, but that can wait for another weekend because my brain hurts!


4 Responses

  1. Impressive! You’re ahead of me. I should get on the ownCloud bandwagon. The Tweetpic stuff is interesting; I’d not thought of it. Any media I send to Twitter I usually have somewhere else first in highest res form.

    If you’ve not done so already, I recommend setting up a Martin Hawksey Genius Twitter archive — it takes the static download that Twitter gives you and sets it to be updated daily (though it’s stored in Google).

    • Thanks Alan – I’ll definitely take a look at Martin’s Twitter archive. Google is not that bad! I suppose it would be possible to use a Google script to convert the data to HTML & host it on my own web pages? I’ll have to investigate.

  2. Hi Simon,
    I’ve tried giving this a go on a subdomain. Just getting errors at the moment. I wonder where your subdomain folder is sitting? I uploaded all the files to a folder set to the subdomain but get a file listing when visiting the URL…
