Hello everyone, hope you're all enjoying your summer so far!
Our next meetup will be taking place on Wednesday, August 20th, on the 3rd floor of Notman House. (The same awesome space where we held our Android Wear hackathon earlier this summer.)
We have 2 great talks scheduled, plus a small recap / show-and-tell session for the various toys we managed to bring back from Google I/O. As always, you're welcome to drop by to introduce yourself and your projects to the community.
- 17h30: Google I/O Recap
- 18h30: Recognizing Speech on Android
- 19h30: Byte-Code Weaving on Android
- 20h30: Beer and food, @Ye Olde Orchard Pub
You can get your tickets on our Eventbrite.
Google I/O 2014 Recap
A small, informal presentation reviewing what was announced at Google I/O, with a short hands-on + Q&A session for:
- Android Wear
- Android TV
- Google Cardboard
Recognizing Speech on Android
by Gina Cook – Android Developer for iLanguage Lab
In this talk I'll show you how to use speech recognition in your own Android apps. The talk will have something for both beginner and advanced Android devs: I will show two ways to do speech recognition, the easy way (using the built-in RecognizerIntent for the user's language) and the hard way (building a recognizer that wraps existing open-source libraries when the built-in RecognizerIntent can't handle the user's language).
While I was in Batumi, my friends and I built an app (code) (slides) so that Kartuli users could train their Androids to recognize SMS messages and web searches. Recognizing Kartuli is one of the cases where you can't use the built-in recognizer: Kartuli is spoken by only 4 million people in the country of Georgia, roughly the population of Montreal and the surrounding area. The talk will start with a demo of our Kartuli trainer app to set the context, then dig into the code and the Android concepts behind the demo. The talk will cover:
- How to use the default system recognizer's results in your own Android projects
- How to use the NDK in your projects
- How to use PocketSphinx (a lightweight recognizer library written in C) on Android
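As a taste of the "easy way", here is a minimal sketch of handing recognition off to the built-in RecognizerIntent and reading back the transcriptions. The activity and helper names are illustrative, not code from the talk:

```java
import android.app.Activity;
import android.content.Intent;
import android.speech.RecognizerIntent;
import java.util.ArrayList;

// Minimal sketch of the "easy way": delegate speech recognition to the
// device's built-in recognizer via RecognizerIntent.
public class SpeechDemoActivity extends Activity {

    private static final int REQUEST_SPEECH = 1; // arbitrary request code

    private void startListening() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        // The free-form model suits dictation (SMS messages, web searches).
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speak now");
        startActivityForResult(intent, REQUEST_SPEECH);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_SPEECH && resultCode == RESULT_OK) {
            // Candidate transcriptions, best match first.
            ArrayList<String> matches =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            if (matches != null && !matches.isEmpty()) {
                handleTranscription(matches.get(0));
            }
        }
    }

    private void handleTranscription(String text) {
        // Use the text: pre-fill an SMS, launch a web search, etc.
    }
}
```

The catch, as the talk explains, is that this only works for languages the device's recognizer supports, which is where the "hard way" (NDK + PocketSphinx) comes in.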
Byte-Code Weaving on Android
by Stéphane Nicolas – Senior Android Developer for Groupon Canada
Over the last few years, we have all witnessed or taken part in a small
revolution on Android: the emergence of annotation processing. Many
popular libs now rely on annotation processing at build time to pre-compute
a lot of the work that boosts them at runtime; think of Dagger,
ButterKnife, Hugo, RoboGuice, BoundBox, Memento, IcePick, etc.
All those libs use annotation processing to generate Java source
files that are executed at runtime. Nevertheless, writing annotation
processors is far from obvious, as the annotation processing API they
are built on is quite difficult to understand.
But there is an alternative: byte-code weaving. This technique is popular
in the Java world for a wide variety of applications, from creating mocks
(EasyMock/Mockito) to data persistence (EclipseLink, Ebean, etc.) to
Aspect-Oriented Programming (AOP).
So, why don't we use post-compilation processing on Android too?
This session offers an overview of different byte-code weaving techniques
and examines whether they can be applied to Android. We will see how
post-compilation byte-code weaving can be used during Android app builds
with either Gradle or Maven.
Finally, we will present two libraries: AfterBurner and Mimic. Both are
based on Javassist and aim to make byte-code manipulation easy on Android.
We will go through simple use cases that show how to eliminate all the
boilerplate of our favorite libraries and boost apps using post-compilation
byte-code weaving. Byte-code weaving could even be used to normalize new
and simpler programming techniques for Android that could get standardized
via Android Specification Requests.
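For readers who haven't seen Javassist before, the kind of post-compile weaving the talk describes looks roughly like this. The class and method names are hypothetical, and this is a sketch of the general technique, not AfterBurner's or Mimic's actual code:

```java
import javassist.ClassPool;
import javassist.CtClass;
import javassist.CtMethod;

// Rough sketch of post-compilation weaving with Javassist: open a
// compiled class, inject code into a method body, and write the
// modified .class back out. A Gradle or Maven plugin would run a step
// like this between compilation and dexing.
public class WeavingSketch {
    public static void main(String[] args) throws Exception {
        ClassPool pool = ClassPool.getDefault();
        // "com.example.MainActivity" is a hypothetical compiled class
        // sitting on the ClassPool's classpath.
        CtClass activity = pool.get("com.example.MainActivity");
        CtMethod onCreate = activity.getDeclaredMethod("onCreate");
        // Weave a statement at the start of the method body, e.g. the
        // kind of wiring a library would otherwise do through generated
        // source files or runtime reflection.
        onCreate.insertBefore(
            "android.util.Log.d(\"Weaving\", \"onCreate woven\");");
        // Overwrite the original .class file under the build directory.
        activity.writeFile("build/classes");
    }
}
```

Because the weaving happens after javac runs, the source code stays untouched and there is no reflection cost at runtime, which is exactly the trade-off the talk contrasts with annotation processing.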