Hull Blogs

Aggregated posts from University of Hull students

How 'smart' are smart meters?

Smart meters are due to be offered to, or installed in, every UK home and business by 2020. This post focuses on the technical challenges and consequences of smart meters, rather than the generic advantages the providers give, looking at how they network and what happens to their data.

Generally with smart meters:

Advantages: no manual readings needed; cheaper prices off peak; informs you of your energy habits; you know exactly what energy you use and when.
Disadvantages: security vulnerabilities; personal data safety; readings have to be verified; real-world savings in the UK estimated at 2%.

Data collection

British smart meters communicate back to the Data Communications Company (DCC, a Capita subsidiary) through SIM cards, and the data is then transferred to the individual energy companies. I originally thought they would connect via powerline networking back to the substation and network from there, which would limit the potential for external interference, as the attack vector would be limited to either physical access to the smart meter or interception between the home and the substation. Instead, smart meters are opened up to an internet of things and a lot of potential vulnerabilities.

Is it actually secure?

(Diagram from ncsc.gov.uk)

The point in this network I would be most concerned about is the communications service, which is exposed to the wider internet and at risk as a result. However, the NCSC have put some pretty neat failsafes into their network:

- All smart meters have unique authentication keys for each meter and message, reducing vulnerability at the smart meter's side and making it very hard to reverse engineer
- Per-role permissions, restricting disconnects to your supplier only
- All meters have to be introduced and set up with a public/private key pair certified by the DCC's own CA
- Active checking for anomalies, e.g. if many service disconnect commands are sent from an infiltrated supplier, the commands will be ignored and the alarm raised
- Limiting the number of simultaneous connections, so not every household is connected at once

Obviously, these don't make the system impenetrable by any means, but an attacker would have to infiltrate both a provider and the DCC to be able to shut off even a few households.

Who owns the data

Onzo is one of the first companies to move in on this field of big data in the smart meter industry, using customer data to create a personal profile and tailor ad or sales campaigns to customers. You do, however, have the right to opt in to or out of data sharing with third parties by your energy supplier, likely by ringing them up or visiting their website. For the moment data usage is very restricted, likely as an attempt to reduce the large stigma and backlash that smart meters have received. You can choose how often each day to send readings, whether your supply details can be used for marketing and whether third parties can see them.

Where they don't work

Smart meters depend on the mobile network. Most mobile providers promise service covering 99% of the country on a coverage map, but travelling around you can see this really isn't the case. Fitting a smart meter in a semi-rural settlement or a home in a valley with no signal will have no impact, as the meter will be unable to communicate with the service provider, so manual readings will still need to be taken.
The smart meter network was designed to accommodate provider changes; however, at least for now there are incompatibility issues, as some of the first 8 million smart meters are incompatible with other providers. Potentially you would need a new meter, or would have to go back to taking manual readings, to move to a new provider, and simply reprogramming all of these smart meters would cost at least £500m. …
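To make the per-message signing point above a little more concrete, here is a minimal sketch of signing and verifying a single reading with a public/private key pair, written in Python with the cryptography library. The message format, field names and key handling are invented for illustration; the real GB smart metering protocol is considerably more involved.

```python
# Illustrative sketch only: signing one meter reading with an EC key pair.
# The message format and field names are invented, not the real DCC protocol.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Key pair that would be generated at install time and certified by a CA
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

# A single (hypothetical) half-hourly reading
reading = b'{"meter": "EXAMPLE-001", "kwh": 0.42, "period": "2017-01-01T00:30Z"}'

# The meter signs the reading before sending it
signature = private_key.sign(reading, ec.ECDSA(hashes.SHA256()))

# The receiving side verifies it against the meter's certified public key;
# verify() raises InvalidSignature if the reading or signature was tampered with
public_key.verify(signature, reading, ec.ECDSA(hashes.SHA256()))
print("reading verified")
```

Because every message carries its own signature, replaying or forging a command without the meter's private key (and a certificate chained to the DCC's CA) should fail verification, which is what makes the unique per-meter, per-message keys worthwhile.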
Read more

Do our cars know too much?

In the era of the Internet of Things, where even fidget spinners can connect to the internet, cars obviously have huge potential to go online to improve driving, avoid congestion, share traffic data and deal with mechanical faults. There are different ways your car can be connected to the internet; BMW, the Audi/VW Group and most other car manufacturers now ship cars with so-called smart features. iDrive (BMW), for instance, can control comfort features (even remotely, by preheating the car), alert the emergency services in the event of a crash (using onboard sensors), and communicate with BMW if there's a vehicle fault. There have also apparently been cases where BMW have repaired mechanical failures remotely, such as a broken sunroof, by remotely operating its motors. Basic features of the vehicle can be controlled through an app, such as preheating and flashing the lights, in addition to the things people have come to expect from cars, like map and media input and live traffic (usually from Google).

Tesla cars essentially have the display as the only way to interact with the car, potentially signalling the way all cars will turn when vehicle autonomy becomes commonplace; Tesla cars can also interact with smart home appliances, for example turning on the lights when you get home. The convenience potential for smart cars is huge.

On the other hand, this means cars are essentially just devices with SIM cards which interact with the internet, and so are exposed if they have vulnerabilities. With smart printers turning on their owners after being infiltrated through default credentials, and with most such devices running embedded Linux (usually BusyBox), a botnet can be created pretty quickly. The stakes for security are much higher in your car than in your toaster though: if a manufacturer's support team can control, start, stop or change any aspect of the vehicle through a backdoor, so can a potential hacker. Whether you're comfortable with this kind of access depends on whether you trust every feature inside a car being overridable from outside. This video demonstrates well just how scary the potential is:

The Internet of Things for cars brings huge potential for the future, but whether security will keep up with it as time passes is a different matter. Whether manufacturers will keep patching vulnerabilities in their cars into the future is something I really doubt, as many cars that are 20 or even 30+ years old are still on the roads today. Will planned obsolescence be forced upon car drivers too, or will people have to choose between getting to work safely in a new car, or risking a journey in a hacked car that could be used to extort or injure them? …
Read more

Why 'Fully Loaded' Kodi Sticks are a horrible idea

Since last year, there has been a vast rise in the number of people buying 'fully loaded' sticks allowing them to access premium services, including Netflix Original Series, films still in cinemas and live Sky Sports streams.

How does it work

Kodi itself is not the system that provides pirated content; that is provided by the plugins which are installed on top of it. There are different types of plugins that serve content in different ways. From research, the content is provided by:

- Scraping the web for film uploads and serving these, or using specially uploaded videos hosted on unlisted sites (apparently this is how the popular plugins Genesis and Mobdro get their content)
- Receiving a service (then using a PVR and live-streaming it to the web), as appears to be done with Sky Sports and other Sky services; the Kodi box then buffers the stream over a protocol (such as RTSP) and displays it
- Downloading a torrent in real time and playing it, while seeding it silently in the background; this gives better quality with lower latency as you have all the advantages of P2P (but is the most likely to send you to prison, as you're technically distributing copyrighted material)

Are these a scam?

In terms of the price paid for these devices, they're a pretty clear scam, with a quick search on eBay for 'Loaded Kodi Stick' bringing up around 750 results. The devices sold are just normal smart TV sticks with a few apps sideloaded onto them (which are free off the internet anyway) and resold at a significant margin. The stick pictured was just the top result: an Amazon Fire Stick (RRP £35) with Kodi sideloaded and the plugins 'Mobdro', 'Exodus' and other repositories promising free HD TV and films as well as live sports from both BT and Sky. So along with the content, the markup charged on these devices is also criminal!

Honeypots and logs

Chances are that the web server you're connecting to for your theoretical films is logging the source IP and the content accessed (and unless you're using a VPN, that's going to be your home IP). In the event of a file-sharing website being seized by the authorities there will be around a day's worth of logs (depending on the host's retention policy) detailing IP addresses. On the other hand, connecting to a honeypot is also a possibility, which would serve you a copy of the media you want while recording your IP; in that instance it's a matter of not if, but when, your ISP contacts you about copyright activity. Streaming such files is likely not worth the potential trouble.

Am I going to prison?

Whether you're going to end up in a cell for watching The Crown from your Kodi stick is a different matter. It's really dependent on the plugins you're using, with those that silently seed programmes in the background being far more likely to get you a warning letter from your ISP. The police have so far come after several people in the UK for distributing these Kodi devices, aiming to shut down the distribution at the source rather than stopping individual users; obviously it's still inadvisable to stream in such ways though.

Conclusion

Kodi themselves are not enjoying this recent influx of users, with their identity being tarnished by association with these brands and their support forums filled with questions about non-functional plugins and broken links. As for the ethics of piracy, it's an interesting problem, and not an issue of demand for media.
An overwhelming majority of people are willing to pay for content but just can't access it at a reasonable price. I wouldn't advise going out and buying one of these streaming boxes: support from these 'vendors' is pretty lacking, you're living your life in a legal grey area, and chances are your ISP won't hesitate to give up your information to a copyright owner who comes calling about piracy from your IP address. …
Read more

'MIS'ery, education IT should be open

MIS' (or Management Information Systems) are arguably the most important part of any school (whether primary or secondary) or college. In both my primary and secondary school/college, Capita's SIMS (School Information Management System) has been the MIS of choice; it is used in 83% of schools in the UK. This is, plain and simple, a monopoly. The 2010 report by the IoE said that the marketplace was uncompetitive: dominated by a lone supplier that is increasing in cost, violating both EU and UK laws on procurement, and using a platform without an open or shared data format. Capita SIMS became a monopoly when local authorities purchased it for all the schools they ran, acting as their initial MIS, and schools have stuck with the system since then as it's easier to remain on the same platform. I've only had real experience with the Lancashire Education Authority, who run SIMS, although I believe there are similar problems in other LEAs.

Over the years, SIMS has expanded its tentacles from core utilities like pupil management to everything from parent online access to dinner money management, all at pretty hefty additional cost. Inside the walled garden there is little space for third-party developers and plugin makers, with official API access requiring you to be a Capita Partner, something that costs several thousand pounds. One EduGeek user rather descriptively called SIMS a "massive sea anchor of a product": a low initial cost draws LA schools in, but high-priced plug-ins sit on top. It stalls the advancement of technology in education too, locking teachers to Windows because the programme's codebase is written on Microsoft's .NET framework and uses Microsoft SQL Server as a backend. There are obviously some hacks to make it work on other platforms such as macOS and Linux, including delivering it as a Citrix-hosted app for staff or wrapping it with Wine (however this would cause nightmares for support); but platforms aren't the core problem, the problem is the software in the first place. SIMS has some problems in itself, being an unreliable beast, with all sorts of issues caused by its database having to be mapped as a network drive in Windows and Microsoft SQL Server sitting beneath it; at the very least the whole programme needs a bottom-up rewrite.

The question is, is now the time for some government intervention? If private companies such as Capita have shown that they can't write a functional programme or keep up with the times, surely GDS or the DfE could build something that:

- Has a nice GUI; it doesn't need to be fancy, just functional
- Makes reporting and statistics easily accessible to staff
- Is free for all (or very cheap) and open source
- Makes all key parts of a modern education system core parts of the system, handling student info, timetabling, achievement, behaviour management, dinner money and parent tracking
- Is built for the future, with a web-based frontend (wrap it in Electron maybe)
- Doesn't lock out developers: publish an API and add plugin functionality so that if people have great ideas for expandability they can use them

A system like this, with core functionality baked in, made open source and developed by the Government, could easily save schools thousands of pounds, with SIMS Learning Gateway (web access for parents) alone costing ≈£9k for installation and another £1k a year in maintenance.
This new system wouldn't lock out vendors like Capita either, as they could add functionality through plugins or provide support services for the system. A serious reform of MIS' is needed post the 2010 Becta report, and not much seems to have changed yet. Another issue with just creating a new system, though, is that it could itself become a monopoly; maybe an open standard for education would be better, allowing schools to move between different MIS'. …
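To illustrate the "publish an API" point above, here is a rough sketch of what a client for a hypothetical open MIS data standard could look like, in Python with the requests library. The endpoint, token and field names are all invented for illustration; no such national standard currently exists.

```python
# Hypothetical example only: the endpoint, fields and token below are invented
# to illustrate what an open, vendor-neutral MIS data standard might expose.
import requests

BASE_URL = "https://mis.example-school.sch.uk/api/v1"  # made-up endpoint
TOKEN = "example-api-token"                            # issued per plugin/integration

def get_student(student_id):
    """Fetch one student record in a shared, documented format."""
    resp = requests.get(
        f"{BASE_URL}/students/{student_id}",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    # Assumed shape, e.g. {"id": "...", "name": "...", "timetable": [...], "attendance": [...]}
    return resp.json()

if __name__ == "__main__":
    student = get_student("S12345")
    print(student["name"], len(student.get("timetable", [])), "timetabled lessons")
```

If every MIS exposed the same documented format, a school could switch supplier without losing its plugins and reports, which is exactly the lock-in described above.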
Read more

My Software Toolkit

Here is a list of the more uncommon Mac apps that I like and that are an important part of my everyday life. Beyond the obvious ones like a browser (I use Chrome, by the way), I thought it would be interesting to share what my software choices are and why. This page now lives at /toolkit and I'll update it over time. Take a look. If you have any nifty apps or utilities to recommend, feel free to tweet or email them to me. …
Read more

Why I run this blog on Jekyll

I launched this blog after buying the domain na.thaniel.uk to replace my social media site, having decided that I needed a blog to discuss my projects and ideas for technology. I shopped around the different blogging platforms, looking at the usual free offerings: Wordpress.com, Blogger, Medium and Ghost (the new platform on the street at the time). Initially I launched the site in early January 2016 hosted on Tumblr, as it gave me the theming options I wanted; however, returning posts in certain taxonomies was limited and I soon learnt that the platform wasn't flexible enough to build the site that I really wanted.

Then I discovered GitHub Pages. After seeing repositories using it for their own websites, I decided that it could be perfect for my use case. After downloading Jekyll and getting it running, I realised it was awesome. Taking the default theme and heavily modifying it, I ended up with the first version of this blog.

All my posts were already written in Markdown on Tumblr, after I read about it being the markup of choice for GOV.UK, Daring Fireball and other sites. It's a more convenient way to write: you write in one way, using markings like **Hello** for bold, and then define the CSS for it everywhere, saving the overhead of having to write <span class="head-post-bold">Hello</span> each time. All my posts ported across fine to the new platform with only some minor tweaks needed.

I discovered many other benefits to using Jekyll on GitHub Pages that have kept me using it, including:

- No more having to mess around configuring a database, maintaining a database or restoring a database when it inevitably breaks
- It's more interesting, as there are few GitHub-compatible plugins to fall back on, challenging me to be more creative when making things work and to learn how to use Liquid (very similar to Jinja2) for querying posts
- The workflow is easy: just open a new Markdown document, fill in some front matter, write the post and push to GitHub, job done
- None of the bloat or worrying about hacking that I would have with a free-standing Wordpress instance; at the end of the day this whole site is just a set of HTML, CSS and MD files sat behind 2FA through GitHub
- It's free, and free to use with a custom domain too, which is awesome
- SSL is easy to set up with CloudFlare, which is really handy too

I'd recommend GitHub Pages as a blogging platform for those who like to keep control of their site's design and functionality and are technically inclined; it's also a lot of fun to work with. …
Read more

The UK's unified banking API is coming, and it will be great

The UK government has given its support to the creation of a universal API to allow access to UK bank account data, in a move to increase competition between banks, but also to allow innovation in the finance industry and give customers a clearer understanding of their savings. The OBWG (Open Banking Working Group), established by the UK Government to find ways to improve interaction with banks in the UK, proposed that "bank data…should be made open data", which encompasses the products banks offer, to increase consumer choice, and that "Open APIs should be built using customer data", which would mean that apps like 'Mint' or 'You Need A Budget' would be able to use data straight from accounts to break down where you can save money, allowing for smarter spending. APIs are great and let developers integrate, as can be seen by how they're being rolled out across government (data.gov.uk). It's important that these would be opt-in (obviously) and provided as an extension to online banking.

The hardest part about implementing this, I imagine, will be the legacy systems that banks use (especially as RBS' systems alone have caused them a major meltdown and prevented two break-up attempts) and creating an interface between those legacy banking systems and modern mobile apps. This reduces the cost to the banks (although the old systems really ought to be rebuilt or upgraded) and works in the interim.

An official API would be unquestionably more secure than the current systems used by data providers such as Yodlee and Finicity, which actually log in using your credentials and screen-scrape data from your account (logging in with your credentials, loading web pages, copying the page source and stripping the financial data out). The main issue with this is that most banks will not protect you if such a provider has a security breach and your money is taken, as you authorised them to use your account. Even though I'm sure the scrapers have good security practices, I'm certainly not risking my accounts being emptied overnight.

When it's released (the initial debut is meant to be early 2017, with a fully fledged product including consumer and business data following later), developers will be able to build some amazing applications on this interface. Consumers will also get better deals on their accounts, with the API allowing developers to see all the account types that banks offer and recommend the best one to their customers (potentially an automatic MoneySavingExpert-like system, returning the best account type for an individual based on their own spending habits). …
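As a rough illustration of the difference from screen-scraping, here is a sketch of what a budgeting app's read-only call against such an API might look like, in Python with the requests library. The endpoint, token and JSON fields are entirely made up; the real Open Banking specification defines its own authentication flow and data model.

```python
# Illustrative sketch only: the endpoint, token and fields are invented.
# The point is that the app holds a scoped, revocable token rather than
# your actual online-banking username and password (as screen-scrapers do).
import requests

API = "https://api.examplebank.co.uk/open-banking/v1"  # made-up base URL
ACCESS_TOKEN = "token-granted-by-the-customer"         # scoped, revocable consent

def recent_transactions(account_id):
    resp = requests.get(
        f"{API}/accounts/{account_id}/transactions",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["transactions"]  # assumed response shape

# A budgeting app could then total spending without ever seeing the
# customer's banking credentials.
spend = sum(t["amount"] for t in recent_transactions("acct-123") if t["amount"] < 0)
print(f"Spent £{-spend:.2f} this period")
```

The key difference from the Yodlee-style approach described above is that the app never holds your online banking password, only a token the customer can revoke at any time.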
Read more

Digital by Default is how government should be done

The Government Digital Service didn't just make a website; they redesigned the way users interact with Government through user journey maps, reducing unnecessary bureaucracy and "transforming the relationship between citizen and state", as Matt Hancock (Cabinet Office) said. By removing the need for 320 separate government websites and merging them into one domain they have saved £3.5 billion in overall costs to the taxpayer since 2012. This set an example worldwide, with the UK winning best e-government website and the UK's open source codebase being adopted by GOVT.NZ and the US Digital Service.

Removing the need to go to the Post Office to renew your tax disc, or to send off for, fill in and send back a pack of forms to renew your passport, is really great. Government agencies such as the DVSA have really embraced the opportunity to go digital first, with services allowing you to hire a car by proving your driving licence details online, and to check a vehicle's history for free before you buy it with just the registration and manufacturer. Previously the DVSA had contracted all of their work out to Fujitsu for 15 years, and after moving it back in house they've reworked all their customer-facing services in a more efficient way, showing that in-house development really does work better.

The latest project the GDS are working on is register creation: building lists that other government platforms can be based upon and allowing services to share data in one system, as opposed to hundreds of different ones around government. These include open registers (such as the electoral register), closed registers (such as the land registry) and private ones (such as organ donor status). The process of researching, testing and then going back to the drawing board is all described on their blog and it's great to see that process.

Unfortunately, not all of government agrees that GDS is doing a good job, and some would much rather it was put back to the way it was before. The old head of GDS, Stephen Foreshew-Cain, was replaced by Kevin Cunnington (who wrote off £1.5 billion on the Universal Credit IT programme failure). According to inside rumours there was a 'minor coup' over the summer, and now there is a stand-off with the DWP and HMRC, who were amid their own transformation projects at the start of GDS and would like to see the agencies go back that way, which is a worry. If GDS were to be dismantled, I think we'd see a return to the old ways of Government IT: contracted out for hundreds of millions of pounds more than the contracts are actually worth and providing less of a return on investment. It would also oust people at the service who are currently working in a more effective way that users actually like.

I've followed the GDS blog for the last three years and it's been intriguing to see how they move from concept to production with care at each step, even moving to write in 'plain English' to increase accessibility (even in HMRC, to make the system actually understandable to the average reader), which is realistically not a development you get when contracting to the lowest bidder. The NHS are starting their own £4bn redevelopment programme, following on from and using GDS' prototype system; the pushback they will likely face going digital by default across the whole of a disconnected health service is likely to be huge though. …
Read more

Have a continuity plan

From working at the Library, I've learned a lot of things, helping people from all walks of life with more different queries than I could count. Unfortunately, something that has affected a surprisingly large group of people I've met is the death of a loved one and wanting access to their data. Modern systems are designed to be secure, which can be a double-edged blade. They're designed to protect people from being impersonated through multi-factor authentication (something you know, have or are), and although there are weak points such as SIM swapping, this enhanced security can end up locking out loved ones if the worst happens.

The most common situation I have had is with Windows laptops (a mix of 7 to 10) where a family member has passed away and people want the photos or files from their laptop, which is sorted easily with Hiren's (if legacy BIOS) or the sticky keys command prompt method (if UEFI boot). The problem is that if people have BitLocker or FileVault enabled I can't help them (which is, on the other hand, great for stopping thieves). Google have a process to request access, but if it's an Apple device you're out of luck, as they don't unlock devices for security reasons. I've had several people with iPads which are activation locked to the deceased's Apple ID, so they cannot be used again, nor can purchases be accessed (making the device a very expensive paperweight).

On the other side of this, devices becoming harder to break into is good: if your device falls into the wrong hands it's safe, and you're protected from being forced to hand over data if your device has been seized (or copied at a border).

If you have important files on your devices (sentimental or otherwise), record your passwords somewhere in case the worst happens. If you use a password manager (like 1Password, LastPass or KeePass), the easiest way to protect yourself from disaster is to create an emergency kit and lock it away in a safe or filing cabinet where it can be accessed if something were to happen to you. Be proactive in these situations; don't leave it to others to be reactive. …
Read more

Nightmares from upgrading Ubuntu 14.04 to 16.04 LTS

This week in my half term, I decided that it would be a good idea to upgrade my home server (an Intel NUC) from Ubuntu 14.04 to 16.04 LTS, but the update wasn't as straightforward as I had hoped (it never is!).

To update I thought I'd use SSH, but after receiving warnings in the terminal that the update would be cancelled if the session dropped, I decided to use RDP (through xRDP) instead. This started off great and the update was running well, until my session dropped and was lost by the RDP server (I couldn't find it in the logs to reconnect to). I was then unable to get back in, and SSH only connected after a good 20 minutes of waiting. It seemed that the update had halted but dpkg was still locked; running sudo rm /var/lib/apt/lists/lock removed the lock (thanks AskUbuntu). I attempted to resume the install using sudo apt dist-upgrade, which I had used previously, but was told I was already up to date on 16.04. After another few questions on StackOverflow I found that apt-get update, apt-get -f install and apt full-upgrade should return me to a working system; in total they took about a day to run, but eventually everything got going again.

The first things that weren't working when I got back in were my network services, which had all stopped. I got Apache, Plex and the others running again before noticing that my NAS drive mount had failed with "mount.nfs: an incorrect mount option was specified". The fstab entry was previously:

192.168.1.3:/volume1/photos /home/serviceusr/Desktop/photos nfs username=serviceusr,password=passwordhere,_netdev,rw,hard,intr,nolock

but after following some tips I got it working again using:

192.168.1.3:/volume1/photos /home/serviceusr/Desktop/photos nfs sec=sys,intr,rw,vers=3,timeo=11,auto,async,bg 0 0

Docker also didn't work after updating; I had to re-add its repository (disabled in the update) and install it again, the best guide for this is here. Previously I had run all my containers with --restart=always, but this didn't survive the update, so instead I added them into my /etc/rc.local file for the future, like this:

docker run -p 8443:8443 -p 8080:8080 -p 8081:8081 -v /var/unifi:/var/lib/unifi -d jacobalberty/unifi:latest

I also decided that I wanted a better backup system, as Syncthing on my Synology just wasn't cutting it (running out of memory, restarting and stopping every 15 minutes), so I mounted the folder containing the users' home folders with fstab on my server and then ran:

curl -s https://syncthing.net/release-key.txt | sudo apt-key add -
echo "deb http://apt.syncthing.net/ syncthing release" | sudo tee /etc/apt/sources.list.d/syncthing.list
sudo apt-get update
sudo apt-get install syncthing

Then I had to run these, replacing $USER with the account you want to run it as on the server:

sudo systemctl enable syncthing@$USER.service
sudo systemctl start syncthing@$USER.service

This failed for me, and with some vague errors it took me ages to work out; it turned out the permissions on the directory and executable were wrong, so I ran:

sudo chmod 777 -R /usr/bin/syncthing
sudo chmod 777 -R /home/USER/.config/syncthing/

The web interface should then be available on server:8384. The SSL certificates should be changed (if you don't like security errors); they are /home/USER/.config/syncthing/https-cert.pem and /home/USER/.config/syncthing/https-key.pem. I just symlinked them to my Apache directory (easier to change my wildcard when it expires that way). And that's me up to the present; hopefully this helped you if you received the same errors! …
Read more

Grav is awesome!

I have recently moved my home internal website from OctoberCMS to Grav, a cool new CMS without a database. Both have clean backends, but I fancied a change of CMS and Grav seemed like the best option. OctoberCMS is nice, but I like Grav's native ability to write Markdown, its cleaner layout and how it's far easier to install (I had many database issues the first time I installed October). Twig templates are also great, something I'm used to from Jinja2 in Flask, and they allow me to create pages far more quickly, though they're not something I need regularly as the site is pretty static. I didn't add any plugins beyond the basic ones either (OctoberCMS had more plugins, but most were paid-only), as I didn't need them for this site. Writing pages manually is great too, as it's pretty much the same as Jekyll (used for this site) with the configuration options at the top of each file. The only disadvantage I can see with Grav at the moment is that you can't get too complex with it as there isn't a database, but the ability to back up the entire site by simply copying the folder is awesome (and great for Git versioning too). Obviously web design varies, and both allow full customisation of how your site looks, but for reference (left and right are old and new respectively): …
Read more

Lessons learned from running Cat6

From about 2008, we've used Comtrend powerline adaptors (they were shipped free with BT Vision). I've never really been a fan of them: if something spontaneously stops working, you can bet they're the problem. With us getting the new UniFi APs, ceiling mounting them and running cables to each, we thought it would be the best time to run cables to the rest of the house too. Dropping cables was more of a pain than I thought it would be (old house, solid walls, joists running against us) but so far I've managed to connect up the key rooms. Advice I'd give from this: get a cable access pole kit (£10 as an Aldi Specialbuy, woo), they're invaluable for finding concealed blockages and navigating down walls; also run more cables than you think you need, as with HDMI over Cat6 available you may use cables for other purposes. I terminated all of my cables in the loft into a 16-port patch panel, with faceplates in the house, but I'm not done yet as I'd like to future-proof the whole house by putting drops in every room where future devices could potentially go. My patch panel positioning still leaves something to be desired (and I need to find a way to mount my Ubiquiti PoE injectors) but apart from that it's awesome to have a network that isn't horribly sluggish! …
Read more

The 'Smart Wallet' is still a fantasy

Passbook was released with iOS 6 in 2012, promising to be a more convenient way to store vouchers, loyalty cards, coupons and event tickets. With the release of the iPhone 6 in 2014, Apple Pay came too (first to the US, then the UK and other select countries) and introduced a new way to pay. NFC payments have been a standard in the UK since 2007, with merchants rolling out support at differing speeds. Apple's launch of Apple Pay in the UK in July 2015 was something I thought was going to be truly market changing, but it wasn't. That was for two reasons: the lack of supporting banks, and the fact that wallets (or purses) aren't just home to credit and debit cards; they also hold a ton of loyalty cards (usually a Booths card if you're northern and like free coffee), ID and membership cards.

Loyalty

There isn't a real loyalty card platform either. Apple introduced loyalty cards in 2015, but they weren't widely adopted, likely because of how much of a pain it is for merchants to introduce them: if you make a digital reward card for iOS, you're also going to have to make one for your Android customers too, which is just an inconvenience. I think that if there were an open standard in place for such cards, like the .pkpass files used previously for Passbook barcodes and flights, more stores would be likely to implement it for their own loyalty systems. Currently Passbook's standard is proprietary and requires a developer account to sign the pass (although this makes Apple some money, it's likely prohibitive to some stores). A format where a loyalty number could be passed over NFC when requested by the POS machine would be ideal, so that in addition to the card info, the loyalty number could also be given. This could be provided from a file containing the store info, in a format similar to the one below, and could work on any platform:

Store loyalty card
- Store name: Booths
- Store logo: store logo image
- Store header image: large image of the side of the card
- Customer name: John Smith
- Background: #444C3F
- Foreground: #FFFFFF
- Loyalty number: 003602
- URL of customer system: siteurl/loyalty/003602

The system could then respond at that URL with the number of points, which could be displayed in the app. The problem with this is that it's unlikely ever to be implemented by Apple or Google, as they have far too much interest in retaining their userbase; an easily movable wallet would break that ecosystem.

Oliver Morley (@omorley1), 13 May 2016: "So here's a little prototype of something we're working on #drivinglicence" pic.twitter.com/a5eItrdiNI

Identity

Realistically, we are not going to lose the wallet any time soon, especially those who look under 25, as they're likely to be ID'd whether going to a club or just buying alcohol from the supermarket. In May this year, the DVLA's CEO posted the tweet above, showing the future of ID, which would be much better. Most people born in at least the last 20 years carry a phone constantly, so being able to carry a driving licence on it would be incredibly useful. The flip side is that it would be incredibly easy to commit fraud, so it would require another system like the one above, with a one-time-use code: a number displayed beneath the ID (and in QR form) which can be queried by pasting it into a government service, which would return the information shown on the card along with an image (this could then be implemented within an app for bouncers and the police and allow the data to be validated).
The one-time-use code changing would prevent those who look it up from returning to the ID after validating, and would stop fake IDs being produced with the same value; if accepted instead of plastic ID, this would entirely prevent fake IDs, as you can't spoof a gov.uk domain and a valid HTTPS certificate (well, not without significant difficulty: changing the local DNS and becoming a trusted intermediate CA).

Membership cards

Membership cards and access cards are also struggling to get with the times. They could do with a system like the one I described for a supermarket, but using the device's built-in NFC to pass the membership number, replacing the current magnetic strip cards used at many gyms. Allowing membership cards to use NFC could also work for work ID systems, which would be more secure than just possessing a card as it would require a fingerprint too.

Membership card
- Membership name: Gym Limited
- Membership logo: membership logo image
- Membership header image: large image of the side of the card
- Customer name: John Smith
- Background: #FEFEFE
- Foreground: #B30098
- Loyalty number: 00354322
- URL of customer system: siteurl/member/00354322

Concluding

Overall, I think that virtual wallets still have a lot more potential for the future than they've shown so far, and I'm excited for that; however, I'm a strong believer that open is better when it comes to standards, and if technology companies would just share, there would be far better offerings for the community. …
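As a rough sketch of how the loyalty format proposed above might behave in practice, here it is as a simple data structure in Python, with the points lookup against the customer-system URL. The field names mirror the example table above and the URL keeps the post's "siteurl" placeholder; nothing here is a real service or an existing wallet API.

```python
# Sketch of the open pass format proposed above. The structure mirrors the
# example table; the URL keeps the post's "siteurl" placeholder and the
# response shape is assumed, not a real API.
import requests

loyalty_pass = {
    "type": "store-loyalty-card",
    "store_name": "Booths",
    "customer_name": "John Smith",
    "background": "#444C3F",
    "foreground": "#FFFFFF",
    "loyalty_number": "003602",
    "customer_system_url": "https://siteurl/loyalty/003602",
}

def current_points(card):
    """Ask the store's (hypothetical) customer system for the points balance."""
    resp = requests.get(card["customer_system_url"], timeout=10)
    resp.raise_for_status()
    return resp.json()["points"]  # assumed response, e.g. {"points": 120}

# A wallet app would pass card["loyalty_number"] to the POS over NFC and
# could show current_points(loyalty_pass) alongside the card.
```

Because nothing in the structure is tied to one vendor's signing scheme, the same file could back an Android wallet, an iOS wallet or a plain web page, which is the openness the post is arguing for.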
Read more

Is AI a threat to mankind?

Yesterday, I visited the University of Liverpool's Heseltine Institute Policy Provocations seminar on Artificial Intelligence to aid my extended project, currently with the working title 'Artificial Intelligence: Friend or Foe?'. I found out about the seminar and decided it would be a great way to widen my knowledge of the area, and to get some further opinions on the ethics of the technology. It was chaired by Dr Roger Phillips (BBC Merseyside) and the panellists were Professor Simon Maskell (Liverpool engineering/computer science), Joanna Bryson (Bath University, natural intelligence) and Sir Robin Saxby (ARM Holdings). They debated whether they were excited or scared by the future of AI, talking about logical and emotional intelligence, whether machines can solve prejudices, whether we trust the machines themselves, and whether the risks are actually covered (and why society mistrusts them). I got some great notes and it was a really interesting evening!

Seminar full video: external video player from liv.ac.uk …
Read more

Home is not the enterprise

Recently, when I set up my UniFi UAP access points, I exchanged my old WPA2-Personal network for WPA2-Enterprise, thinking that it would be simpler for my family and more secure. However, I quickly hit some unfortunate snags that led me to revert the network back to WPA2-Personal. Firstly, my main issue was how lacking support is in consumer devices, with devices like Chromecasts and game consoles not supporting the standard; that was fine though, as I could just set up another SSID for these devices with a long and random password. However, when I configured the RADIUS server on my Synology NAS to be used for authentication and moved everybody across, we hit the issue that authentication would hang for roughly 30 seconds every time a device roamed or left and returned to the house. I couldn't find a better LDAP/RADIUS server that runs on Linux with a good web interface (if you know of one, please let me know), and I'd rather not have the overhead of spinning up a Windows VM with Active Directory. I have learnt that although a technology may be easy to implement and may work for me, I have to design the best system for all of the users, which is one that is near enough zero-management and works relatively well unattended. …
Read more