Jun 5, 2017

Business Chat - Apple Developer

link

Business Chat is a powerful new way for businesses to connect with customers directly from within Messages. Using Business Chat, your customers can get answers to questions, resolve issues and complete transactions on their iPhone, iPad, and Apple Watch. Customers can find your business and start conversations from Safari, Maps, Spotlight, and Siri.

Is this what I think it is? i.e. Bots for Apple’s Messages App? It looks like we may need to wait until Friday to find out…

May 18, 2017

TensorFlow Lite Introduces Machine Learning in Mobile Apps

link

TensorFlow Lite, a streamlined version of TensorFlow for mobile, was announced by Dave Burke, vice president of engineering for Android. Mr. Burke said: “TensorFlow Lite will leverage a new neural network API to tap into silicon-specific accelerators, and over time we expect to see DSPs (Digital Signal Processors) specifically designed for neural network inference and training.” He also added: “We think these new capabilities will help power the next generation of on-device speech processing, visual search, augmented reality, and more.”

A really interesting move. I remember mentioning the use of mobile-hosted neural networks a year ago, when Apple launched an inbuilt neural network API in iOS 10. Apple isn’t typically acknowledged as an AI leader, so perhaps being first out of the gate with this trend explains some of the scepticism I received at the time.

But I think there’s something big going on here.

Apple has neural network support in iOS 10. Google just launched TensorFlow Lite for mobile. And Facebook has Caffe2Go, its mobile-friendly machine learning library.

Importantly, the training of these models stays on the server - but the execution of those models moves to mobile devices. And as mobile chip makers increasingly pivot their GPU efforts away from just graphics to meet the needs of more general-purpose ML workloads, those devices become pretty powerful.
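The split described above is easy to see in code: training is the expensive part, while inference on an already-trained model is just a handful of multiply-adds per layer. Here’s a minimal toy sketch in plain Python (no real framework’s API is implied, and the weights are made up) of the forward pass a device would run:

```python
def relu(v):
    # Standard rectified-linear activation, applied element-wise.
    return [max(0.0, x) for x in v]

def matvec(W, v):
    # Matrix-vector product: one row of multiply-adds per output unit.
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def forward(x, layers):
    """Run inference only: the weights are assumed to have been trained
    server-side and shipped to the device."""
    for W, b in layers:
        x = relu([s + bi for s, bi in zip(matvec(W, x), b)])
    return x

# A tiny one-layer "model" with made-up weights, for illustration.
layers = [([[0.5, -0.2], [0.1, 0.4]], [0.0, 0.1])]
print(forward([1.0, 2.0], layers))  # ≈ [0.1, 1.0]
```

The heavy lifting (learning the weights) never has to leave the server; only this cheap feed-forward step runs on the phone, which is why mobile GPUs and DSPs suddenly matter.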

And we’re going to need this mobile power, because we may be about to see a huge explosion in mobile capabilities. Once Augmented Reality (AR) becomes a standard feature with depth-sensing cameras, our phones will be able to map what they see in 3D space. Then it starts getting interesting! If the mobile device has the power to do the ML work, why not exploit it? It’s going to be cheaper than building a server farm for sure.

May 15, 2017

A16Z AI Playbook

link

Artificial Intelligence (AI) is a set of computer science techniques that, as Stanford professor Andrew Ng is fond of saying, gives your software super powers.

Building on our Primer on Artificial Intelligence, this microsite is intended to help newcomers (both non-technical and technical) begin exploring what’s possible with AI. We’ve met with hundreds of Fortune 500 / Global 2000 companies, startups, and government policy makers asking: “How do I get started with artificial intelligence?” and “What can I do with AI in my own product or company?”

This site is designed as a resource for anyone asking those questions, complete with examples and sample code to help you get started.

While there are dozens of excellent tutorials available on the web (once you’ve figured out what library or API you want to use - we’ve listed a few of our favorites in the Reference section), we felt a pre-tutorial - a “Chapter 0”, if you will - was missing: something to help you survey the landscape broadly, give you a sense of what’s possible, and help you think about how you might use artificial intelligence techniques to make your software smarter, your users happier, and your business better.

Quite simply, this is a stunning resource. AI tends to be populated with either meaningless hype or indecipherable mathematics, with nothing much in between. I really can’t praise this site enough - it’s hands-down the best, most well-grounded introduction to a sometimes very complex subject.

May 14, 2017

API.AI “Small Talk” Is Now Open! Why Is It A Big Deal?

link

And this update itself solves both of our problems: you can easily open any intent, change the responses, and add more training data (take a look at the screenshot below). And that’s super awesome! 🚀

Small Talk from API.AI is free and open, allowing others to change, tune, and modify it as they wish. This is how we move things forward: in conversational systems we can’t succeed if such things are closed and proprietary, because then they can’t be improved by others.

Luckily there’s healthy support for open source in the bot world. We’ve seen how a plethora of bot connectivity frameworks - MS Bot Framework, BotKit, BotMaster, Droid, Bottr, etc. - have led to innovation in integration. If you want to play in bot connectivity, your framework has to be open and free, because everyone else’s is.

IMHO we are about to see the same with conversational content - intents, entities, dialogs, etc - published for others to use and improve.
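To make “conversational content” concrete: an intent is, at heart, a set of training phrases mapped to candidate responses - plain data that can be published, forked, and improved by others. A toy sketch (the schema below is invented for illustration; it is not API.AI’s actual export format):

```python
import random

# A hypothetical, shareable "small talk" intent set: training phrases
# mapped to candidate responses. This schema is invented for illustration
# and is not API.AI's real export format.
INTENTS = {
    "greeting": {
        "phrases": ["hi", "hello", "hey there"],
        "responses": ["Hello!", "Hi, how can I help?"],
    },
    "thanks": {
        "phrases": ["thanks", "thank you"],
        "responses": ["You're welcome!"],
    },
}

def match_intent(utterance):
    """Naive exact-match lookup. A real NLU engine generalises beyond the
    listed phrases, but the content being shared is the same kind of data."""
    text = utterance.strip().lower()
    for name, intent in INTENTS.items():
        if text in intent["phrases"]:
            return name, random.choice(intent["responses"])
    return None, None
```

Because the intents are plain data, anyone can open one, edit a phrase, or add a response - which is precisely the openness this update enables.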

May 12, 2017

Amazon Launches Redesigned Alexa App for iOS – MacStories

link

One of the app’s three main navigation tabs is devoted to messaging. You can send voice or text messages to anyone with an Amazon Echo product, or anyone who has the Alexa app installed on their iPhone or iPad. The app’s various communication methods are all easy to use. Pressing and holding the microphone button will record a voice message. Once you send it, the message is immediately transcribed so that recipients can either listen to the audio, or read its contents within the app. If you’d rather send a text-only message, you can tap the keyboard icon in the bottom right corner to do that. Phone calls are also possible from the same screen, by tapping the phone icon in the top right corner.

If you’re on the receiving end of a message, whether it’s voice or text-only, you’ll be notified about it both through a standard iOS notification and through your Echo device. From there it’s up to you which device you’d prefer to listen to or view the message from.

Of course you might expect me to be leading with the news of the Echo Show - an Echo with a screen. But no, I think the more interesting Echo/Alexa news is the upgraded Alexa app and its new messaging features. Now Alexa - both in the app and on the Echo device - can be used to send and receive messages.

Unfortunately, messaging doesn’t appear to be turned on in the UK yet, so I’ve not managed to check it out - but I’m fascinated to give it a try!

May 8, 2017

This is Amazon’s new Echo with a built-in touchscreen | AFTVnews

link

Here is the first look at what Amazon’s upcoming Echo with a built-in touchscreen looks like. I was able to locate this unfortunately low resolution image on Amazon’s servers in a similar manner to how I found what we now know to be the Echo Look. This new member of the Echo lineup is codenamed “Knight” and is expected to be released later this month.

Well, my one complaint with Amazon Echo has been the lack of a screen. Without a screen it’s hard to be inspired. For example, I rarely know what music I want - but flipping through my library often jogs the memory. Without a screen, Echo is quite limited.

Interestingly, Apple’s Phil Schiller seems to think the same. He’s quoted here as saying he thinks the screen is important on Assistant devices. Now Phil and Apple are not the kind of folk to say this by accident - so this must be Apple’s corporate thinking. I wonder what this means for the rumoured Apple competitor to Echo?

These devices promise a huge uptick in Conversational systems - we’re going to be talking to devices much more, and so the need for systems that understand language will only increase. #ConvComm is the future!

Apr 29, 2017

Caffe2 on iOS, Deep Learning Tutorial | iOS Swift Tutorials by Jameson Quave

link

At this year’s F8 conference, Facebook’s annual developer event, Facebook announced Caffe2 in collaboration with Nvidia. This framework gives developers yet another tool for building deep learning networks for machine learning. But I am super pumped about this one, because it is specifically designed to operate on mobile devices! So I couldn’t resist digging in immediately.

I’m still learning, but I want to share my journey in working with Caffe2. So, in this tutorial I’m going to show you step by step how to take advantage of Caffe2 to start embedding deep learning capabilities into your iOS apps.

There’s a pretty solid assumption across most of the tech industry that “AI runs in the cloud”, or at least on some beefy servers in a data centre. But not necessarily, as this article shows. It’s now possible to run one of the major ML frameworks on an iOS device. That might seem strange at first glance, but I don’t think it’s quite so crazy. The GPUs in these devices (such workloads need a GPU for its parallel processing capability) are getting pretty capable - granted, not competitive with a real beast of a high-end desktop GPU - but not all ML workloads need that kind of grunt.

I’m willing to bet there’s a class of ML work that can be done quite effectively on an iPhone. And we already know that Apple is making some major investments in a new generation of GPU architectures in its custom mobile chips - quite possibly spurred on by the need to run not just graphics algorithms, but ML ones as well.

So the emergence of Caffe2 as something that runs on an iOS device right now is pretty interesting. Let’s not forget that these devices are likely to morph into mobile processing machines for AR and VR in the not too distant future. Just consider the volume of visual and sensory data generated in that environment and the probable need to make sense of it with ML algorithms.

So, no, ML isn’t just about the cloud. We’re going to see it on our iPhones and iPads as well. Not all workloads, but some. I’m willing to bet the future is a hybrid one.

Apr 28, 2017

Apple’s Echo-Like Smart Speaker With Siri and AirPlay Could Debut as Early as WWDC - Mac Rumors

link

Dickson said that Apple is currently “finalizing designs” for the Amazon Echo and Google Home competitor, which he expects to be marketed as a Siri and AirPlay device. “It is believed to carry some form of Beats technology,” he added, while noting that the device will run a variant of iOS software.

Dickson later told MacRumors that the device, allegedly codenamed B238 internally, will feature a Mac Pro-like concave top with built-in controls. His source, which he told us is “someone inside Apple,” described the device as “fat” like the Google Home with speaker mesh covering the majority of the device.

AI-enabled voice tech is becoming a big thing. I know many people with Amazon Echos - and it’s actually a great product. If Apple does enter the market, it could end up being a big influence. We also have Google Home, and Microsoft is rumoured to be building a Cortana-powered device. This feels like a significant new market to me.

Apr 27, 2017

Oh Great, Now Alexa Will Judge Your Outfits, Too | WIRED

link

“They want to sell things, so telling people that they look good just the way they are is probably not what they’re going to do,” says Weisz.

Some possibly quite insightful analysis of the new Echo Look. I have yet to be convinced this is a good idea.

Apr 27, 2017

Amazon unveils Echo Look, a selfie camera to help you choose what to wear | Technology | The Guardian

link

Amazon has unveiled the Echo Look, a new voice-controlled selfie camera pitched as the ultimate bedroom companion that allows AI assistant Alexa to give you fashion tips and tell you what to wear.

The camera uses the depth information to produce “computer vision-based” blurred backgrounds so you can apparently look your best in full-length selfies. It will also capture video, so you can give your audience a twirl in your finery.

OK, I didn’t see this one coming. I think it’s obvious this is not targeted at me. I struggle to comprehend why I would want to take hands-free selfies every morning. Take a look at the video - baffling! But maybe I’m just not representative of everyone out there, so it’ll be interesting to see how this works out.
