This is something that I’ve been working on for over 2 years now.
I’ve mentioned it before, but I thought I’d add some photos.
And we have the cameras hooked up to cool ‘robots’ / gantry systems:
Heh, funny title.
I became interested in machine learning while working for Nokia. I worked on Nokia’s Z Launcher application for Android. You can scribble a letter (or several), and it recognizes it and searches for it. The app is available for download in the Play Store.
I worked on the Nokia Z Launcher’s handwriting recognition
Specifically, I was tasked with optimizing the speed of the recognition. I don’t know if I can state any specifics on how the character recognition was done, but I will say that I managed to increase the speed of the recognition a hundredfold.
That recognition was a relatively simple task compared to modern deep neural networks, but it really whetted my appetite to understand more.
When AlphaGo beat Lee Sedol, I knew that I simply had to understand deep neural networks.
Below is my journey in understanding, along with my reflective thoughts:
Passed the Coursera Machine Learning course with a 97.6% score.
The lecturer, Andrew Ng, was absolutely awesome. My only real complaint is that I wish the course was twice as long so that I could learn more from him! I now help out in a machine learning chat group and find that most of the questions people ask about TensorFlow, Theano etc. are actually basics that are answered very well by Andrew Ng’s course. I constantly direct people to it.
I needed to rename a whole load of files that were in Japanese, so I wrote a Python program that translates the filenames using Google Translate.
It’s not at all fancy, just run it and pass the filenames to translate and rename.
E.g.:
$ ls
こんにちは世界.png
$ sudo pip3 install googletrans
$ translate_rename.py こんにちは世界.png
こんにちは世界.png -> Hello_World.png
$ ls
Hello_World.png
#!/usr/bin/python3
import sys, re, os
from googletrans import Translator

translator = Translator()
sourceLanguage = 'ja'
destLanguage = 'en'

# Set to False to actually rename the files
dryRun = True

def translate_and_rename(filename):
    filenameSplit = filename.rsplit('.', 1)
    translated = translator.translate(filenameSplit[0], src=sourceLanguage, dest=destLanguage).text
    translated = re.sub('[^a-zA-Z0-9.]+', '_', translated).strip().title()
    if len(filenameSplit) > 1:
        translated += '.' + filenameSplit[1]
    if filename == translated:
        print(filename, ' (unchanged)')
    else:
        print(filename, " -> ", translated)
        if not dryRun:
            os.rename(filename, translated)

def main(argv):
    if len(argv) == 1:
        print("Need to pass filenames to translate and rename")
    for x in argv[1:]:
        translate_and_rename(x)
    if dryRun:
        print()
        print(" Dry run only - no actual changes made ")
        print()
        print("Edit this file and set dryRun to False")

if __name__ == "__main__":
    main(sys.argv)
Maybe a clickbait title, sorry, but I couldn’t think of a better title.
The CPU ‘Meltdown’ bug affects Intel CPUs. From Wikipedia:
Since many operating systems map physical memory, kernel processes, and other running user space processes into the address space of every process, Meltdown effectively makes it possible for a rogue process to read any physical, kernel or other processes’ mapped memory—regardless of whether it should be able to do so. Defenses against Meltdown would require avoiding the use of memory mapping in a manner vulnerable to such exploits (i.e. a software-based solution) or avoidance of the underlying race condition (i.e. a modification to the CPUs’ microcode and/or execution path).
This separation of user and kernel memory space is exactly what I worked on from 2012 to 2014 on behalf of Deutsche Telekom, using the L4 hypervisor:
The idea was to give each service its own separate memory space, designed on the assumption that the main OS has been compromised and is not trustworthy (e.g. because of the Meltdown bug). I personally worked on the graphics driver, splitting the kernel graphics driver into two parts: one side that the app talks to, which must be assumed compromised, and one side that actually talks to the hardware.
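The split-driver idea can be sketched abstractly: the untrusted half simply forwards requests from the (possibly compromised) OS, and the trusted half validates every request before anything touches the hardware. This is only an illustrative Python sketch under my own assumptions; the real work was a kernel graphics driver in C under the L4 hypervisor, and all names below are made up.

```python
# Illustrative sketch of a split driver: the untrusted half forwards
# requests unchecked, and the trusted half validates each one before
# it would ever reach the hardware. All names are hypothetical.

FRAMEBUFFER_SIZE = 1920 * 1080 * 4  # assumed framebuffer size in bytes

def trusted_blit(offset, length):
    """Trusted side: validate the request, then (here, pretend to) blit."""
    if offset < 0 or length <= 0 or offset + length > FRAMEBUFFER_SIZE:
        return False  # reject anything out of bounds
    # ... a real driver would program the hardware here ...
    return True

def untrusted_request(offset, length):
    """Untrusted side: forwards whatever the app asked for, unchecked."""
    return trusted_blit(offset, length)
```

The point is that even if the untrusted side is fully compromised, the worst it can do is ask; the trusted side decides.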
Here’s my work in action:
Yes, I did actually use Angry Birds as my test. Fully hardware accelerated too 🙂
Unfortunately, the problem was that porting each phone across took too long. It took me a year to port the graphics driver changes, and a similar time for my colleagues to do the other drivers. Then it took another year for the phone to actually hit the market. The result was that the phone was always over two years out of date by the time it launched, which is a long time in mobile phone terms.
Still, our software would be immune to this type of bug, and that’s kinda cool. Even if it did fail commercially 😉
I love TypeScript. I use it whenever I can. That said, sometimes it can be… interesting. Today, out of the blue, I got the following TypeScript error in code that used to work:
[06:53:30] typescript: src/mycode.ts, line: 57
Property 'video' does not exist on type 'number | (<U>(callbackfn: (value: Page, index: number, array: Page[]) => U, thisA...'.
Property 'video' does not exist on type 'number'.
The code looks like:
return _.chain(pages)
    .filter((s, sIdx) => s.video || s.videoEmbedded)
    .map((s, sIdx) => {
        if (s.video) {
            ...
Can you spot the ‘error’?
The problem is that s.video || s.videoEmbedded isn’t returning a boolean. It’s returning a truthy value, but not a boolean. The lodash TypeScript developers made a change a month ago which meant that filter() would only accept booleans, not just any truthy value. And they are now finding that fixing this gets surprisingly complicated. See the full conversation here:
https://github.com/DefinitelyTyped/DefinitelyTyped/issues/21485
(Open issue at time of writing. Please leave me feedback or message me if you see this bug get resolved)
The workaround/fix is to just make sure it’s a boolean. E.g. use !! or Boolean(..) or:
return _.chain(pages)
    .filter((s, sIdx) => s.video !== null || s.videoEmbedded !== null)
    .map((s, sIdx) => {
        if (s.video) {
            ...
Just a light-hearted post today. My daughter had a great idea to build a felt house for her teddy cat, so we did it together.
It was a fun project, and simple to make. The wood is just children’s blocks glued together, and the felt is stapled on. The strap was cut from an ecobag. A piece of velcro was added to hold the door open.
I played about with making a robot face. I combined an ultrasonic sensor (HC-SR04) with NeoPixel LEDs. The LEDs reflect the distance: one pixel per 10 cm.
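The distance-to-pixels mapping can be sketched as a small function, assuming one pixel per 10 cm as described above. This is only a sketch: the actual HC-SR04 trigger/echo timing and the NeoPixel driver calls are hardware-specific and not shown, and the strip length of 8 is my own assumption.

```python
NUM_PIXELS = 8  # assumed NeoPixel strip length

def pixels_for_distance(distance_cm):
    """Return how many pixels to light: one pixel per 10 cm, clamped to the strip."""
    lit = int(distance_cm // 10)         # one pixel per 10 cm
    return max(0, min(NUM_PIXELS, lit))  # clamp to [0, NUM_PIXELS]
```

For example, an object 35 cm away would light 3 pixels, and anything beyond 80 cm lights the whole strip.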
I’m wondering if I can do SLAM (Simultaneous Localization And Mapping) with a single (or a few) very cheap ultrasonic sensors. Well, the answer is almost certainly no, but I’m very curious to see how far I can get.