Hull Blogs

Aggregated posts from University of Hull students

The UK's unified banking API is coming, and it will be great

The UK government has given its support to the creation of a universal API for accessing UK bank account data, in a move to increase competition between banks, but also to allow innovation in the finance industry and give customers a clearer understanding of their savings. The OBWG (Open Banking Working Group), established by the UK Government to find ways to improve interaction with banks in the UK, proposed that “bank data…should be made open data”, encompassing the products banks offer, to increase consumer choice. It also proposed that “open APIs should be built using customer data”, which would mean that apps like ‘Mint’ or ‘You Need A Budget’ could use data straight from accounts to break down where you can save money, allowing for smarter spending. APIs are great and let developers integrate, as can be seen in how they’re being rolled out across government (data.gov.uk). It’s important that these would be opt-in (obviously) and provided as an extension to online banking.

The hardest part of implementing this, I imagine, will be the legacy systems that banks use (especially as RBS’ system alone has caused them a major meltdown and prevented two break-up attempts). Creating an interface between the legacy banking systems and modern mobile apps reduces costs to the banks (although the old systems really ought to be rebuilt or upgraded), but it works in the interim. An official API would be unquestionably more secure than the current systems used by data providers such as Yodlee and Finicity, which actually log in using your credentials and screen-scrape data from your account (loading web pages, copying the page source and stripping the financial data out). The main issue with this is that most banks will not protect you if such a provider has a security breach and your money is taken, as you authorised them to use your account.
As sure as I am that the scrapers have good security practices, I’m certainly not risking my accounts being emptied overnight. When it’s released (the initial debut is meant to be early 2017, with a fully fledged product including consumer and business data to follow), developers will be able to build some amazing applications on top of this interface. Consumers will also get better deals on their accounts, with the API allowing developers to see all the account types that banks offer and recommend the best ones to their customers (potentially an automatic MoneySavingExpert-like system, returning the best account type for an individual based on their own spending habits). …
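To picture what this could enable: once transaction data is exposed through an official API, a budgeting tool only needs a few lines to produce a Mint-style breakdown. A minimal Python sketch, where the field names and sample payload are my own assumptions rather than anything the OBWG has specified:

```python
# Hypothetical sketch: the payload shape below is invented for
# illustration, not part of any published Open Banking specification.
import json
from collections import defaultdict

SAMPLE_RESPONSE = json.dumps({
    "transactions": [
        {"description": "Tesco", "category": "groceries", "amount": -32.50},
        {"description": "Greggs", "category": "eating-out", "amount": -4.20},
        {"description": "Salary", "category": "income", "amount": 1200.00},
        {"description": "Asda", "category": "groceries", "amount": -18.75},
    ]
})

def spend_by_category(payload: str) -> dict:
    """Total outgoing spend per category (a 'Mint'-style breakdown)."""
    totals = defaultdict(float)
    for tx in json.loads(payload)["transactions"]:
        if tx["amount"] < 0:  # outgoing payments only
            totals[tx["category"]] += -tx["amount"]
    return dict(totals)

print(spend_by_category(SAMPLE_RESPONSE))
# {'groceries': 51.25, 'eating-out': 4.2}
```

In a real integration the payload would come from an authenticated HTTPS call to the bank's API rather than a hard-coded string, but the aggregation logic would look much the same.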
Read more

Why the 'Snoopers Charter' scares me

The Investigatory Powers Bill, fittingly nicknamed the Snoopers’ Charter, passed through the House of Lords this week, and it’s quite scary. The policy, which relies very much on ‘if you have nothing to hide, you have nothing to fear’, sets quite a dangerous precedent (due to the breaking of client confidentiality in legal situations and the monitoring of what journalists are writing about). The whole bill seems quite poorly planned. The government requires ISPs to retain user data for up to a year, potentially costing them up to £1 billion for retention of metadata for all customers, with price hikes passed on to consumers; the £174 million the Home Office set aside to reimburse all ISPs for 10 years of collection will only just cover the initial installation of BT’s own infrastructure, not the maintenance of the equipment. It’s also going to be very hard to actually do anything with the amount of data collected: with all domains visited recorded on a per-customer basis, how they are going to dig through it all to catch criminals is beyond me. Whether ISPs actually have the competence to store this information is another concern. After the huge TalkTalk hack of November 2015, a leak of unencrypted personal information could be devastating, and the prospect of every website you’ve ever visited being shared online isn’t a great one. Beyond the obvious cases where a person faces potential blackmail over this information, it could show who your email provider is, who you bank with and who your utility companies are, opening you to further attacks. The only mention of encryption in the law is where it carries over from RIPA that companies must “remove any encryption”, which pretty much negates the use of encryption in the first place if it can be easily broken. If you’re letting the good guys in through a back door, chances are that the bad guys are coming in the same way.
(Image credit: The Daily Alternative.) It seems that the main point of the law is legalising something that has been happening for at least the last decade, with GCHQ intercepting fibre optic cables and collecting data from them, as well as expanding their reach to more information and legitimising the hacking of webcams and microphones to record conversations. It’s going to be a concerning time with the policy in law, and I’m sure there will be a spike in VPN usage over the concern it has caused. …
Read more

Digital by Default is how government should be done

The Government Digital Service didn’t just make a website: they redesigned the way users interacted with Government through user journey maps, reducing unnecessary bureaucracy and “transforming the relationship between citizen and state”, as Matt Hancock (Cabinet Office) said. By removing the need for 320 separate government websites and merging them into a single domain, they have saved £3.5 billion in overall costs to the taxpayer since 2012. This set an example worldwide, with the UK winning best e-government website and the UK’s open source codebase being implemented by GOVT.NZ and the US Digital Service. The removal of the need to go to the Post Office to renew your tax disc, or to send off for, fill in and send back a pack of forms to renew your passport, is really great. Government agencies such as the DVSA have really embraced the opportunity to go digital first, with services allowing you to hire cars by proving your driving licence online, and to check a vehicle’s history for free before you buy it, just with the registration and manufacturer. Previously the DVSA had contracted out all of their work to Fujitsu for 15 years, and after moving it back in house they’ve reworked all their customer-facing services in a more efficient way, showing that in-house development really does work better. The latest project the GDS are working on is register creation: building lists that other government platforms can be based upon, allowing services to share data in one system as opposed to hundreds of different ones around government. These include open registers (such as the electoral register), closed registers (such as the land registry) and private ones (such as organ donor status). The process of researching, testing and then going back to the drawing board is all described on their blog, and it’s great to see that process. Unfortunately, not all of the government agrees that GDS is doing a good job, and some would much rather it were put back to the way it was before.
The old head of GDS, Stephen Foreshew-Cain, was replaced by Kevin Cunnington (who wrote off £1.5 billion on the failed Universal Credit IT programme). According to inside rumours there was a ‘minor coup’ over the summer, and there is now a stand-off with the DWP and HMRC, who were amid their own transformation projects at the start of GDS and would like to see the agencies go back that way, which is a worry. If GDS were to be dismantled, I think we’d see a return to the old ways of Government IT: contracted out for hundreds of millions of pounds more than the contracts are actually worth, providing less of a return on investment, and ousting people at the service currently working in a more effective way that users actually like. I’ve followed the GDS blog for the last three years, and it’s been intriguing to see how they move from concept to production with care at each step, even moving to write ‘plain English’ to increase accessibility (even in HMRC, to make the system actually understandable to the average reader), which is realistically not a development you get when contracting to the lowest bidder. The NHS are starting their own £4bn redevelopment programme, following on from and using GDS’ prototype system, though the pushback they will likely face going digital by default across the entirety of a disconnected health service is likely to be huge. …
Read more

Have a continuity plan

From working at the Library, I’ve learned a lot of things, helping people from all walks of life with more different queries than I could count. Unfortunately, something that has affected a surprisingly large group of people I’ve met is the death of a loved one and wanting access to their data. Modern systems are designed to be secure, which can be a double-edged sword. They’re designed to protect people from being impersonated through multi-factor authentication (something you know, have or are), and although they have weak points such as SIM-swap attacks, this enhanced security can end up locking out loved ones if the worst happens. The most common situation I’ve had is with Windows laptops (a mix of 7 to 10) where family have passed away and people want the photos or files from their laptop. This is sorted easily with Hiren’s (if legacy BIOS) or the sticky keys command prompt method (if UEFI boot); the problem is that if people have BitLocker or FileVault enabled I can’t help them (which is, on the other hand, great for stopping thieves). Google has a process to request access, but if it’s an Apple device you’re out of luck, as they don’t unlock devices for security reasons. I’ve had several people with iPads which are activation locked by the deceased’s Apple ID, so they cannot be used again, nor can purchases be accessed (making the device a very expensive paperweight). On the other side of this, devices becoming harder to break into is good: if your device falls into the wrong hands it’s safe, and you’re protected from being forced to hand over data if your device has been seized (or copied at a border). If you have important files on your devices (sentimental or otherwise), record your passwords somewhere in case the worst happens.
If you use a password manager (like 1Password, LastPass or KeePass), the easiest way to protect yourself from disaster is to create an emergency kit and lock it away in a safe or filing cabinet, where it can be accessed if something were to happen to you. Be proactive in these situations; don’t leave it to others to be reactive. …
Read more

Nightmares from upgrading Ubuntu 14.04 to 16.04 LTS

This week in my half term, I decided that it would be a good idea to update my home server (an Intel NUC) from Ubuntu 14.04 to 16.04 LTS, but the update wasn’t as straightforward as I had hoped (it never is!). To update I thought I’d use SSH, but after receiving warnings in the terminal that if the session dropped it would cancel my update, I decided to use RDP (through xRDP) instead. This started off great and the update was running well, until my session dropped and was lost by the RDP server (I couldn’t find it in the logs to reconnect to it). I was then unable to get back in, and SSH only connected after a good 20 minutes of waiting. It seemed that the update had halted but dpkg was still locked; this command removed the lock (thanks, AskUbuntu):

sudo rm /var/lib/apt/lists/lock

I attempted to resume the install using sudo apt-get dist-upgrade, which I had used previously, but was told I was already up to date on 16.04. After another few questions on StackOverflow I found that the following should return me to a working system:

sudo apt-get update
sudo apt-get -f install
sudo apt full-upgrade

In total they took about a day to run, but things eventually got going again. The first things that weren’t working when I got back in were my network services, which had all stopped. I got Apache, Plex and the others running again before noticing that my NAS drive mount had failed with mount.nfs: an incorrect mount option was specified. The fstab entry was previously:

192.168.1.3:/volume1/photos /home/serviceusr/Desktop/photos nfs username=serviceusr,password=passwordhere,_netdev,rw,hard,intr,nolock

(username and password aren’t valid NFS mount options, hence the error), but after following some tips I got it working again using:

192.168.1.3:/volume1/photos /home/serviceusr/Desktop/photos nfs sec=sys,intr,rw,vers=3,timeo=11,auto,async,bg 0 0

Docker also didn’t work after updating; I had to re-add its repository (disabled by the upgrade) and install it again. The best guide for this is here.
Previously I had run all my containers with --restart=always, but this didn’t survive the update, so instead I added them to my /etc/rc.local file for the future, like this:

docker run -p 8443:8443 -p 8080:8080 -p 8081:8081 -v /var/unifi:/var/lib/unifi -d jacobalberty/unifi:latest

I also decided that I wanted a better backup system, as Syncthing on my Synology just wasn’t cutting it (running out of memory, restarting and stopping every 15 minutes), so I mounted the folder containing the users’ home folders with fstab on my server and then ran:

curl -s https://syncthing.net/release-key.txt | sudo apt-key add -
echo "deb http://apt.syncthing.net/ syncthing release" | sudo tee /etc/apt/sources.list.d/syncthing.list
sudo apt-get update
sudo apt-get install syncthing

Then run these, replacing USER with the account you want to run it as on the server:

sudo systemctl enable syncthing@USER.service
sudo systemctl start syncthing@USER.service

This failed for me, and with some vague errors it took me ages to work out that I didn’t own the directory or the executable, so run:

sudo chmod -R 777 /usr/bin/syncthing
sudo chmod -R 777 /home/USER/.config/syncthing/

The web interface should then be available on server:8384. The SSL certificates should be changed (if you don’t like security errors); they are /home/USER/.config/syncthing/https-cert.pem and /home/USER/.config/syncthing/https-key.pem. I just symlinked them to my Apache directory (easier to change my wildcard when it expires that way). And that’s me up to the present; hopefully this helped if you received the same errors! …
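After an upgrade that mangles mounts like this, it can be handy to sanity-check that the expected filesystems actually came back. A small Python sketch (the mount-table string below is sample data standing in for a real /proc/mounts):

```python
# Sketch: check that expected filesystems are mounted by parsing the
# /proc/mounts format. Sample data below; on a real system you would
# read the file itself.
SAMPLE_PROC_MOUNTS = """\
/dev/sda1 / ext4 rw,relatime 0 0
192.168.1.3:/volume1/photos /home/serviceusr/Desktop/photos nfs rw,vers=3 0 0
"""

def mounted_at(proc_mounts: str, mount_point: str) -> bool:
    """Return True if any entry's mount point (second field) matches exactly."""
    for line in proc_mounts.splitlines():
        fields = line.split()
        if len(fields) >= 2 and fields[1] == mount_point:
            return True
    return False

print(mounted_at(SAMPLE_PROC_MOUNTS, "/home/serviceusr/Desktop/photos"))  # True
```

On the real server you would pass in open("/proc/mounts").read() and could run the check from cron to catch a failed bg mount early.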
Read more

Grav is awesome!

I have recently moved my home internal website from OctoberCMS to Grav, a cool new CMS without a database. Both have clean backends, but I fancied a change of CMS and Grav seemed like the best option. OctoberCMS is nice, but I like Grav’s native ability to write markdown, its cleaner layout and how it’s far easier to install (I had many database issues the first time I installed October). Twig templates are also great, something I’m used to from Flask, and let me create pages far more quickly, though they’re not something I need regularly as the site is pretty static. I didn’t add any plugins beyond the basic ones either (OctoberCMS had more plugins, but most were paid-only); I didn’t need them for this site. Writing pages manually is great too, as it’s pretty much the same as Jekyll (used for this site), with the configuration options at the top of each file. The only disadvantage of Grav that I can see at the moment is that you can’t get too complex with it, as there isn’t a database, but the ability to back up the entire site by simply copying the folder is awesome (and great for Git versioning too). Obviously web design varies, and both allow full customisation of how your site looks, but for reference (left and right are old and new respectively): …
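For a flavour of that Jekyll-style page format, a Grav page is just a markdown file with the configuration options at the top. The fields below are a typical minimal example I’ve made up for illustration, not taken from my actual site:

```markdown
---
title: Home
published: true
---

# Welcome

The body is plain markdown; Grav renders it with no database involved.
```

Backing up or versioning a page is then just copying that file, which is what makes the whole-folder backup approach work so well.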
Read more

Lessons learned from running Cat6

From about 2008, we used Comtrend powerline adapters (they were shipped free with BT Vision). I’ve never really been a fan of them: if something spontaneously stops working, you can bet they’re the problem. With us getting the new UniFi APs, ceiling-mounting them and running cables to each, we thought it would be the best time to run cables to the rest of the house too. Dropping cables was more of a pain than I thought it would be (old house, solid walls, joists running against us), but so far I’ve managed to connect up the key rooms. The advice I’d give from this is to get a cable access pole kit (£10 as an Aldi Specialbuy, woo); they’re invaluable for finding concealed blockages and navigating down walls. Also run more cables than you think you need: with HDMI over Cat6 available, you may end up using cables for other purposes. I terminated all of my cables in the loft into a 16-port patch panel, with faceplates in the house, but I’m not done yet, as I’d like to future-proof the whole house by putting drops in every room where future devices could potentially go. My patch panel positioning still leaves something to be desired (and I need to find a way to mount my Ubiquiti PoE injectors), but apart from that it’s awesome to have a network that isn’t horribly sluggish! …
Read more

The 'Smart Wallet' is still a fantasy

Passbook was released with iOS 6 in 2012, promising to be a more convenient way to store vouchers, loyalty cards, coupons and event tickets. With the release of the iPhone 6 in 2014 came Apple Pay (first in the US, then the UK and other select countries), introducing a new way to pay. NFC has been a standard in the UK since 2007, with merchants rolling out support at differing speeds. Apple’s launch of Apple Pay in the UK in July 2015 was something I thought was going to be truly market-changing, but it wasn’t. That was down to two reasons: the lack of supporting banks, and the fact that wallets (or purses) aren’t just home to credit and debit cards; they also home a ton of loyalty cards (usually a Booths card if you’re northern and like free coffee), ID and membership cards.

Loyalty

There isn’t a loyalty card platform either. Apple introduced loyalty features in 2015, but they weren’t widely adopted, likely because of how much of a pain it is for merchants to introduce them. If you make a digital reward card for iOS, you’re also going to have to make one for your Android customers too, which is just an inconvenience. I think that if there were an open standard in place for these cards, like the .pkpass files used previously for Passbook barcodes and flights, more stores would be likely to implement it for their own loyalty systems. Currently Passbook’s standard is proprietary and requires a developer account to sign the pass (although this makes Apple some money, it’s likely prohibitive to some stores). A format that could have a loyalty number passed over NFC, requested by the POS machine, would be ideal: in addition to the card info being passed, the loyalty number could also be given.
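To sketch what that NFC exchange might look like, here is a toy Python simulation where the POS asks the phone's wallet for a loyalty number alongside the payment data. Every name and field here is invented for illustration; no such standard exists yet:

```python
# Toy simulation of the proposed NFC exchange: the POS requests the
# loyalty number for its store alongside the card details. All names
# and fields are hypothetical.
WALLET = {
    "payment_token": "tok_4421",      # stand-in for the NFC payment data
    "loyalty": {"Booths": "003602"},  # store name -> loyalty number
}

def pos_request(wallet: dict, store: str) -> dict:
    """What the POS receives: payment data plus any matching loyalty number."""
    return {
        "payment_token": wallet["payment_token"],
        "loyalty_number": wallet["loyalty"].get(store),  # None if no card held
    }

print(pos_request(WALLET, "Booths"))
# {'payment_token': 'tok_4421', 'loyalty_number': '003602'}
```

The point of the sketch is the shape of the exchange: one tap hands over both payment and loyalty details, with no per-platform card app needed.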
This could be provided from a file containing the store info, in a format similar to the one below, and could work on any platform:

Store name: Booths
Store logo: store logo image
Store header image: large image of the side of the card
Customer name: John Smith
Background: #444C3F
Foreground: #FFFFFF
Loyalty number: 003602
URL of customer system: siteurl/loyalty/003602

The system could then respond at the URL with the number of points, which could then be displayed in the app. The problem with this is that it’s unlikely ever to be implemented by Apple or Google, as they have far too much interest in retaining their userbase; an easily movable wallet would break that ecosystem.

“So here's a little prototype of something we're working on #drivinglicence pic.twitter.com/a5eItrdiNI” — Oliver Morley (@omorley1), 13 May 2016

Identity

Realistically, we are not going to lose the wallet any time soon, especially those who look under 25, as they’re likely to be ID’d whether going to a club or just buying alcohol from the supermarket. In May this year, the DVLA’s CEO posted the tweet above, showing the future of an ID, which would be much better. Most people born in at least the last 20 years carry a phone constantly, so being able to carry their driving licence on it would be incredibly useful. The flip side is that it would be incredibly easy to commit fraud, so it would require another system like the one above: a one-time-use code displayed beneath the ID (and in QR form) that can be queried by pasting it into a government service, which would return the information shown on the card along with an image (this could then be implemented within an app for bouncers and the police, allowing the data to be validated).
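The one-time-code verification just described could be sketched like this in Python: the issuing service derives each code from a server-side secret and a counter, and a verifier can redeem a given code only once. Everything here (the secret, the code length and the licence number) is a made-up illustration, not a real gov.uk service:

```python
# Toy sketch of server-side one-time verification codes for a digital ID.
import hashlib
import hmac

SECRET = b"server-side-secret"  # assumption: held only by the issuing service

def one_time_code(licence_number: str, counter: int) -> str:
    """Derive a short, non-guessable code; a new counter gives a new code."""
    msg = f"{licence_number}:{counter}".encode()
    digest = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return digest[:8].upper()

used = set()  # codes already redeemed by a verifier

def verify(licence_number: str, counter: int, code: str) -> bool:
    """Accept a code once; replays (the 'look up later' case) fail."""
    expected = one_time_code(licence_number, counter)
    if code != expected or (licence_number, counter) in used:
        return False
    used.add((licence_number, counter))
    return True

code = one_time_code("EXAMPLE123", 1)
print(verify("EXAMPLE123", 1, code))  # True the first time
print(verify("EXAMPLE123", 1, code))  # False on replay
```

Because each code is single-use and derived from a secret only the service holds, a screenshot of an old ID (or a fake with a copied code) fails validation.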
The one-time-use code changing would prevent those who look it up from returning to the ID after validating, and would stop fake IDs being produced with the same value. If this were accepted instead of plastic ID too, it would entirely prevent fake IDs, as you can’t spoof a gov.uk domain with a valid HTTPS certificate (well, not without significant difficulty: changing the local DNS and becoming a trusted intermediate CA).

Membership cards

Membership cards and access cards are also struggling to get with the times. They could do with a system like the one I described for supermarkets, but using the device’s built-in NFC to pass the membership number, replacing the current magnetic stripe system used for cards at many gyms. Allowing membership cards to use NFC could also work for work ID systems, which would be more secure than just possessing a card, as it requires a fingerprint too.

Membership name: Gym Limited
Membership logo: membership logo image
Membership header image: large image of the side of the card
Customer name: John Smith
Background: #FEFEFE
Foreground: #B30098
Membership number: 00354322
URL of customer system: siteurl/member/00354322

Concluding

Overall, I think that virtual wallets still have a lot more potential than they’ve shown so far, and I’m excited for that. However, I’m a strong believer that open is better when it comes to standards, and if technology companies would just share, there would be far better offerings for the community. …
Read more

Is AI a threat to mankind?

Yesterday, I visited the University of Liverpool’s Heseltine Institute Policy Provocations seminar on Artificial Intelligence, to aid my extended project, currently with the working title ‘Artificial Intelligence: Friend or Foe?’. I found out about the seminar and decided it would be a great way to widen my knowledge of the area and get some further opinions on the technology’s ethics. It was chaired by Dr Roger Phillips (BBC Merseyside), and the panellists were Professor Simon Maskell (Liverpool engineering/computer science), Joanna Bryson (Bath University, natural intelligence) and Sir Robin Saxby (ARM Holdings). They debated whether they were excited or scared by the future of AI, covering logical and emotional intelligence, whether machines can solve our prejudices, whether we trust the machines themselves, and whether the risks are actually covered (and why society mistrusts them). I got some great notes and it was a really interesting evening! The full seminar video is available from liv.ac.uk. …
Read more

Home is not the enterprise

Recently, when I set up my UniFi UAP access points, I exchanged my old WPA2-Personal network for WPA2-Enterprise, thinking it would be simpler for my family and more secure. However, I quickly hit some unfortunate snags that have led me to revert the network back to WPA2-Personal. My main issue was how lacking consumer device support is, with devices like Chromecasts and games consoles not supporting the standard, but that was fine, as I could just set up another SSID for these devices with a long and random password. However, when I configured the RADIUS server on my Synology NAS and moved everybody across, we hit an issue where authentication would hang for roughly 30 seconds every time a device roamed, or left and returned to the house. I couldn’t find a better LDAP/RADIUS server that runs on Linux with a good web interface (if you know of one, please let me know), and I’d rather not have the overhead of spinning up a Windows VM with Active Directory. I have learnt that although a technology may be easy to implement and work for me, I have to design the best system for all of the users: one that is near enough zero-management and works relatively well unattended. …
Read more

Where virtualization shines - observing YHA

Last week I was on residential at a YHA hostel in the Lakes, volunteering to help out with tasks and improve their facilities. It was a great trip and a break from the internet (as the hostel didn’t have WiFi available). However, despite there being no network access for the public or staff, the POS desktop was still somehow connected to the internet, with what appeared to be GuestCentrix running, a contactless payment machine and access to email. When logged in there was Windows inception, with the local machine’s Windows 7 taskbar sat below a Windows 10 taskbar; the blue status bar at the top gave away that it was Microsoft’s Remote Desktop. This move has also been made by the Australian YHA (documented here), and for businesses like the YHA it’s a great idea, due to many of their properties being located off the beaten track and not reached by FTTC schemes (the nearby pub’s WiFi stats: download 0.5Mbps, upload 0.11Mbps, ping 167ms). If all the PCs were set up individually, it would’ve taken days just to download Windows 10, for example. Another advantage is keeping corporate software up to date: when the programmes used are updated, they only need to be changed in one place, as opposed to having to update them in the 200+ locations across England and Wales. From a security perspective, if computers are stolen the data is safe, as it isn’t on the device (assuming the RDP session isn’t left unlocked, of course), and customer data gets an extra layer of protection. There are many situations where just using RDP is a bit of a crap idea, like when my school previously used it for all computers (even where students were using graphically intensive programmes and watching videos), which it just couldn’t handle; but in the YHA’s situation, for just reservation applications and basic searches, it is the perfect technology. …
Read more

Unifi-ing my home network

After using a TP-Link AP for four years as a range extender for a BT Home Hub, I decided enough was enough. After two years, I had made the swap to DD-WRT, which made the situation more bearable despite there not being a compatible build for my model: I took a leap in the dark and flashed it with the firmware for a similar TP-Link access point. I got almost zero-handoff functionality by making it spoof the BT router’s MAC address, but it still needed a script to reboot it weekly to stop most of the random drop-outs. It served us well though, and brought us up to March. In March, we spotted some UAPs on eBay, stripped out of offices at a great price, so we bought two and set them up at opposite ends of the house. The actual adoption process confused me at first, but I got the hang of it eventually. After getting the controller up and running in a Docker container, managing the APs with the controller and setting up the network SSIDs, everything was working great. I decided to try WPA2-Enterprise, using my Synology NAS as my RADIUS server, which didn’t work well (random drop-outs), but when I switched back to WPA2-Personal after a month everything was running smoothly again. These APs are pretty powerful and handle the handful of clients we have with ease. Where Ubiquiti excels even more, though, is their controller software, which is miles ahead of anything else I’ve seen, allowing everything to be controlled through a single UI and also having a mobile app. I can see if an AP has dropped, client data usage and signal strength, as well as a glanceable view of my network. I’m yet to find any real downside of using the Ubiquiti access points besides their lack of 802.11ac, but we’re not really reaching the speeds of Wireless N anyway (no FTTP yet) and the area around our house isn’t particularly congested on the channels. I’d certainly recommend the APs, as once they’re set up, they pretty much manage themselves.
If you’re confused setting up your UniFi AP too, download the UniFi Discover app from their website:

1. Reset your UniFi using the pinhole on the back and connect it to your network.
2. It should appear in the Discover app; press ‘Manage’ and point it at your UniFi Controller (run the controller on a Raspberry Pi if you want it to be always on).
3. Give it the controller’s IP and inform port (usually 8080), in the form http://example.com:8080/inform. …
Read more

Programming Day with Harry & Nightmares with Ruby

I recently met up with Harry (@harryb0905) and we spent the day experimenting with a BlinkStick and a BBC micro:bit, which was pretty cool. We tried to get Jekyll installed on Harry’s laptop, which caused a bit of a hassle: it turns out that when installing Ruby through Homebrew, we hadn’t accounted for the fact that macOS itself ships a version of Ruby. After ages of fighting frantically with the error below (first dealing with the fact that OS X 10.11 brought System Integrity Protection), we realised we could solve it by installing RVM (Ruby Version Manager).

Harrys-MacBook:Harry-Site HarryBaines$ jekyll -v
/System/Library/Frameworks/Ruby.framework/Versions/2.0/usr/lib/ruby/2.0.0/rubygems/core_ext/kernel_require.rb:55:in `require': cannot load such file -- bundler (LoadError)
    from /System/Library/Frameworks/Ruby.framework/Versions/2.0/usr/lib/ruby/2.0.0/rubygems/core_ext/kernel_require.rb:55:in `require'
    from /Library/Ruby/Gems/2.0.0/gems/jekyll-3.2.0/lib/jekyll/plugin_manager.rb:34:in `require_from_bundler'
    from /Library/Ruby/Gems/2.0.0/gems/jekyll-3.2.0/exe/jekyll:9:in `<top (required)>'
    from /usr/local/bin/jekyll:23:in `load'
    from /usr/local/bin/jekyll:23:in `<main>'

But once you install RVM, everything becomes plain sailing, we discovered:

curl -L https://get.rvm.io | bash -s stable --ruby
gem install jekyll

and you’re done.

BlinkStick script

I’ve had a BlinkStick for a while but hadn’t really found a purpose for it, so together, inspired by the BlinkStick documentation, we wrote a script which authenticates with Gmail, then checks periodically to see if there’s any unread mail; if there is, the LED will flash three times.
The script we created is below:

import urllib
import feedparser
import time
from blinkstick import blinkstick

username = "example@gmail.com"
password = "gmailpassword"
_URL = "https://mail.google.com/gmail/feed/atom"

bstick = blinkstick.find_first()
greenCol = "#0AFC12"

class my_opener(urllib.FancyURLopener):
    # Supply the Gmail credentials for HTTP Basic authentication
    def get_user_passwd(self, host, realm, clear_cache=0):
        return (username, password)

def auth():
    '''Fetch the Gmail Atom feed using HTTP Basic authentication'''
    opener = my_opener()
    f = opener.open(_URL)
    feed = f.read()
    return feed

def readmail(feed):
    '''Parse the Atom feed and print a summary'''
    atom = feedparser.parse(feed)
    record = len(atom.entries)
    print("")
    print(atom.feed.title)
    print("You have %s new mails" % len(atom.entries))
    if bstick is not None and len(atom.entries) > 0:
        bstick.morph(hex=greenCol)
    return record

if __name__ == "__main__":
    while True:
        f = auth()
        valOld = readmail(f)
        starttime = time.time()
        time.sleep(5.0 - ((time.time() - starttime) % 5.0))
        valNew = readmail(f)
        if valOld != valNew:
            # New mail arrived: pulse the LED, then settle back to green
            bstick.pulse(name="green", repeats=3, duration=100)
            bstick.morph(hex=greenCol)
        starttime = time.time()
        time.sleep(5.0 - ((time.time() - starttime) % 5.0))
        f = auth()
        readmail(f)
…
Read more

Why I use 1Password

I first started with 1Password back in 2013, moving to it from a physical password book stuffed with scribblings of credentials on different notepad paper shoved into the book. I discovered the app on Reddit, where, if I recall, it was highly recommended, so I downloaded it for the £12.99 it cost at the time (a hefty sum, but unquestionably worth it). You can see to the right my phone, my old iPod Touch and the book I used to use: a reduction in bulk and an increase in portability. At the time I moved, I still used Windows, so I kept my passwords by my side, ready to key in manually. Today, I’m storing 500-ish items in my vault, a mix of logins, notes, IDs and more, and most importantly, I feel safe about it. Despite the app being closed source, their openness and availability to explain individual aspects of the software is unrivalled. There are fully open alternatives available, such as KeePass, but honestly I trust AgileBits more, as their software is updated almost daily (through the beta programmes) and the apps are made by them, not ported to different OSes by shady third parties. 1Password protected me through Heartbleed with its Watchtower feature, alerting me to which passwords needed to be changed, so I could simply go to the website, randomly generate a new password and save it in 1Password. It moves the duty of remembering these away from me; I only have to remember the one password for my vault (although this should still be changed regularly). I can only talk about AgileBits in the highest regard: in the times I’ve spoken to them, they’ve responded within hours and found a quick resolution. There has, however, been some negative press about 1Password. In November 2015 it emerged that their old 1PasswordAnywhere format did not encrypt metadata, which was a bit of a worry, as it meant the domains and titles of the sites stored were not encrypted.
However, within hours there was a solution available, and the AgileBits support staff were on hand on Reddit and other social networks to let people know how to migrate if they wanted to do so before it was officially patched. One company controlling everything works really well in this respect: if a fundamental flaw were found in the KeePass file format (for instance), it would take weeks, months or more for all the developers of the different KeePass apps to update to a new format (assuming all the apps were still supported); this coordination, I feel, really gives them the edge. Alternatively, I could have chosen LastPass instead, but their recent absorption by LogMeIn doesn’t fill me with hope for their future. I also don’t like using their cloud infrastructure to handle my syncing; I’d much rather sync over WiFi or via files myself. …
Read more

DofE Gold Award

I recently completed my DofE Gold award with my Sixth Form. My final expedition was 4 days walking in the Yorkshire Dales, and the walk was challenging, but certainly worth it! Here’s a video of how it went, edited by Shuck Productions, with the intro made by myself (in After Effects): …
Read more