Two weeks ago I got an idea for a new app that combines two technologies: VisionKit, which supports live text recognition, and MusicKit, which supports integrating Apple Music into an app. I got the idea after seeing the great app Cibo, which lets you scan a restaurant menu and shows images of the dishes on it.
The app is named Tunica, with the tagline "listen to what you read about". It's like Shazam for text: point your camera at an album title or artist name, and you immediately see a list of matches from Apple Music that you can listen to right in the app.
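The two building blocks can be sketched roughly like this. This is a minimal, hypothetical sketch, not Tunica's actual code: it uses Vision's `VNRecognizeTextRequest` for the text-recognition step and MusicKit's `MusicCatalogSearchRequest` for the Apple Music search; the function names and parameters are my own illustration.

```swift
import Vision
import MusicKit

// Step 1: recognize text in a captured camera frame.
// (Hypothetical helper; the real app may wire this into a live camera feed.)
func recognizedStrings(in image: CGImage) throws -> [String] {
    var strings: [String] = []
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        strings = observations.compactMap { $0.topCandidates(1).first?.string }
    }
    request.recognitionLevel = .accurate
    try VNImageRequestHandler(cgImage: image).perform([request])
    return strings
}

// Step 2: search the Apple Music catalog for a recognized string,
// e.g. an album title or artist name.
func searchAppleMusic(for term: String) async throws -> MusicCatalogSearchResponse {
    var request = MusicCatalogSearchRequest(term: term,
                                            types: [Album.self, Artist.self])
    request.limit = 5
    return try await request.response()
}
```

MusicKit requires the user to authorize access (`MusicAuthorization.request()`) before catalog requests succeed; the matched albums and artists in the response can then feed the results list.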
With the help of some very good examples from Apple, this turned out to be quite easy, and I managed to complete the implementation in less than 20 hours, including setting up supporting web pages and having the app reviewed and approved by Apple. All of that in time to have it available to users the same day iOS 15 is released.