Swapnil talks with Ryan Sipes, CTO of Mycroft AI, to learn more about the Mycroft project and why they chose to open source the Adapt parser.
Sourced through Scoop.it from: www.linux.com
The future is artificially intelligent. We are already surrounded by devices that continuously listen to every word we speak: Siri, Google Now, Amazon Alexa, and Microsoft's Cortana. The biggest problem with these AI "virtual assistants" is that users have no real control over them. They use closed source technologies and send every bit of information they collect back to their makers. Some industry leaders, such as Elon Musk (Tesla, SpaceX), are wary of AI; to help ensure that AI will not turn against humanity, they created a non-profit organization called OpenAI. Linux users, however, may not have to worry. A very ambitious project called Mycroft is building a friendly AI virtual assistant for Linux users. I spoke with Ryan Sipes, CTO of Mycroft AI, to learn more about the product.

The Humble Beginning

When Ryan and Mycroft co-founder Joshua Montgomery, who owns a makerspace, were visiting a Kansas City makerspace called Hammerspace, they found someone working on an open source intelligent virtual assistant project called Iris. Although it was a really neat technology, it was very simple and basic; Ryan recalled that you had to say exactly the right phrase to trigger anything. The two were interested in the technology, but they didn't like the way it had been built around such a rigid concept.

[Photo: Ryan Sipes, CTO of Mycroft AI]

They figured that somewhere, someone was already doing something similar, so they hit the Internet and found many projects; some were dead, and many others approached the problem in a way that didn't suit the two entrepreneurs. They even tried Jasper, but despite being developers, they had a hard time getting it to run. All they wanted was an intelligent system for the makerspace: nothing fancy like Amazon Echo, just a speaker hanging on the wall that let users do things by voice.
People could ask, for example, "Where is the hammer?" and it would tell them; or they could tell it to turn the lights off in a particular room. That was all they wanted. So they built their own, and once the software was ready, they realized it was really slick: it could be used at home and in the office to do many things. Initially they had no product in mind, but they decided to take it public and turn it into one.

Ryan and Josh are serial entrepreneurs, so funding the project themselves was not a problem; nevertheless, they chose to crowdfund it. "The main reason behind going to Kickstarter was market validation. We wanted to see whether there was any interest in such a product. We wanted to know if people were willing to invest money in it. And the response was overwhelming," said Ryan.

Additionally, they decided to make all of this work open source. They used open source software, including Ubuntu Snappy Core, and open hardware, such as the Raspberry Pi 2 and Arduino. The public mandate was already there; there was demand for the product. The Mycroft project raised more than $127,520 on Kickstarter and another $138,464 on Indiegogo. Once the project was fully funded, Mycroft set aside around half of the money to fulfill the Kickstarter hardware rewards, and the rest went toward finishing the development effort.

Going Open Source

Earlier this month, the developers released the Adapt intent parser as open source. When many people look at Mycroft, they think voice recognition is the important piece, but the brain of Mycroft is Adapt. It takes natural language, parses the sentence, and then decides what action needs to be taken.
That means when someone says "turn the lights off in the conference room," Adapt extracts the intent, "turn off," and identifies the entity, "conference room." It then reaches out to whatever device controls the lights in the conference room and tells it to turn them off. That is complex work, and the Mycroft developers just open sourced the biggest and most powerful piece of their software.

"The only way we can compete with companies like Amazon and Google is by being open source. I can't see how we could compete with them if we had only the resources we have to work on this. In house, we have probably five people total, so there is no way we could compete with the huge teams at those big companies. But the cool thing is that 20 minutes after the Adapt code was released, we had a pull request. We had our first contribution," said Ryan.

Going open source immediately started paying off, and something even more remarkable happened: just an hour after the release, core developers of the Jasper project had already forked the repo and started working on it. Now even more brilliant people are working on the same software to make it better. Nowhere else but in open source will you see "competitors" working together on shared technologies.

Ryan recalls an interesting conversation with business people who didn't understand the open source model: "When we talked to business guys and they asked what's the point of going open source instead of proprietary, I explained it this way: I spent no money and my software improved within 20 minutes of release. Then those business guys get it."

Going open source goes beyond small patches from contributors; it makes a project richer. Ryan said that when he talked to his family and friends about it, they would say: make it do this, make it do that. And these were not things that either Ryan or the other team members had thought of.
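The intent/entity split described earlier, where "turn the lights off in the conference room" yields the intent "turn off" and the entity "conference room," can be sketched in a few lines of plain Python. This is not Adapt's actual API; the function and vocabularies below are hypothetical, purely to illustrate the idea:

```python
# Toy intent parser illustrating the intent/entity split that an
# engine like Adapt performs. A hypothetical sketch, not Adapt's real API.

def parse_intent(utterance, intents, entities):
    """Return (intent_name, entity) found in the utterance, or (None, None)."""
    text = utterance.lower()
    words = set(text.replace(",", " ").split())
    # An intent matches when all of its keywords appear as whole words.
    intent = next((name for name, kws in intents.items()
                   if all(kw in words for kw in kws)), None)
    # An entity matches as a phrase anywhere in the utterance.
    entity = next((e for e in entities if e in text), None)
    return intent, entity

# Hypothetical vocabularies for a smart-lighting skill.
INTENTS = {"turn off": ["turn", "off"], "turn on": ["turn", "on"]}
ENTITIES = ["conference room", "kitchen", "workshop"]

print(parse_intent("turn the lights off in the conference room",
                   INTENTS, ENTITIES))
# -> ('turn off', 'conference room')
```

A production parser like Adapt also handles synonyms, optional keywords, and confidence scoring, but the core idea is the same: identify the action and the thing the action applies to.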
The open source development model allows other people with different ideas to do exciting things with the project. Ryan says they see the Mycroft software going beyond the hardware. It is also Linux's best chance at getting its own Siri, Cortana, or Alexa. Because Canonical and Mycroft are working together, there is a possibility that Ubuntu phones, tablets, IoT devices, and even the desktop may use Mycroft as their AI virtual assistant. It could also be used in games and robots. I see real potential in cars: you could use it for navigation, ask about the weather or traffic, control your music, open and close the sunroof and windows, and much more. And, because it's an open source project, anyone can get involved. I wish I were able to tell my Linux desktop, "Mycroft, open the community page of Mycroft!"