Last.fm integration through Swift

The upcoming release of Rigelian will focus on the artist view: a more image-based view, improved handling of artist images, and the addition of artist biographies and similar artists.

Biographies and similar artists will be retrieved from Last.fm, which offers an API for this. To abstract the networking part, I created a new package that encapsulates the API calls in nice observables.

import RxSwift

let bag = DisposeBag()
let lastfmApi = LastFMApi(apiKey: "your key")
lastfmApi.info(artist: "Taylor Swift") // method name reconstructed; the original line was garbled
    .subscribe(onNext: { result in
        switch result {
        case let .success(info):
            print("Info: \(info)")
        case let .failure(error):
            print("Error: \(error)")
        }
    })
    .disposed(by: bag)

The package is available on GitHub.

CI/CD with Bitrise

To make the development process more reliable, I added 4 of the open-source libraries on my GitHub account to Bitrise. For small developers, Bitrise offers a free service for automated build pipelines on different platforms, including the Apple ecosystem.

Every commit now triggers the build pipeline, including a run of the test set (if present, which is currently not the case for part of my code), so at the very least I will know when an existing build breaks. The build status is also shown on the repository page.

Two of the libraries had dependencies on UIKit, which caused build issues. The easiest way to solve that turned out to be removing the dependency and relying on Foundation only. That also means the Swift packages now (at least in theory) support all the Apple platforms: iOS, macOS, watchOS and tvOS.
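To illustrate the idea, here is a small sketch of what dropping a UIKit type in favour of a Foundation-only representation can look like. The type and its fields are hypothetical, not taken from the actual libraries:

```swift
import Foundation

// Hypothetical example: instead of storing a UIImage (UIKit-only),
// a package-level type keeps the platform-neutral pieces only.
// The app layer can turn the data into a UIImage or NSImage as needed.
struct ArtistImage {
    let url: URL
    let data: Data?
}

let image = ArtistImage(url: URL(string: "https://example.com/artist.jpg")!,
                        data: nil)
print(image.url.host ?? "")
```

An alternative would be wrapping the UIKit-specific parts in `#if canImport(UIKit)` blocks, but removing the dependency outright keeps the package simpler.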

Rigelian 2.0 is in the mail

I just completed the last bits and pieces and submitted the new 2.0 version of Rigelian to the App Store, where it is now going through the review and approval process.

When I started the development of Rigelian over 2 years ago, I had a couple of things in mind. The main goal was to create the best remote control for mpd-based players. That meant it had to be easy to set up and use (hiding configuration where possible), very responsive, and great looking.

But next to that I also had technical goals I wanted to achieve. To make sure that extending and supporting the app would be possible with a reasonable amount of effort, I started from scratch using the latest that Apple's development ecosystem has to offer. That meant GitHub, Swift, RxSwift, CocoaPods and later the Swift Package Manager, plus many of the great open-source packages made available by the Swift developer community. And bye-bye to Objective-C after almost 30 years (yes, really 30 years: I did my graduation project in computer science back in 1990 using Objective-C on a DEC Ultrix machine).

And I wanted to use the experience I collected over the last 10 years of building music remote controls: first MPoD and MPaD, and later the iOS controller for the Bluesound system. I set out to abstract the generic concepts of a music remote control, and create a front-end that can operate completely independently of the player it is controlling. For that I created ConnectorProtocol, which defines a generic way to communicate with a player regarding discovery, control, browsing and status updates. The front-end only knows about the protocol and has zero knowledge of the specific implementation. The first implementation against the protocol was MPDConnector, which has been the basis for Rigelian up to 1.7.2. And for the last half year, in parallel with further developing the front-end, I created a second implementation, this time for Kodi: KodiConnector.
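The idea behind that separation can be sketched roughly as follows. This is a simplified illustration, not the real ConnectorProtocol: the protocol name and its requirements here are my own stand-ins:

```swift
import Foundation

// Simplified sketch of the player abstraction. The real ConnectorProtocol
// covers discovery, control, browsing and status updates; these names
// are illustrative only.
protocol PlayerControl {
    var name: String { get }
    func play()
    func pause()
}

// One implementation per player type; the front-end never sees these directly.
final class MPDPlayer: PlayerControl {
    let name = "MPD"
    private(set) var isPlaying = false
    func play() { isPlaying = true }
    func pause() { isPlaying = false }
}

final class KodiPlayer: PlayerControl {
    let name = "Kodi"
    private(set) var isPlaying = false
    func play() { isPlaying = true }
    func pause() { isPlaying = false }
}

// The front-end works against the protocol, with zero knowledge
// of which concrete player it is controlling.
func startPlayback(on player: PlayerControl) {
    player.play()
    print("Playing on \(player.name)")
}
```

Because the front-end depends only on the protocol, adding a new player type is a matter of writing one more conforming implementation.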

And by now this has reached the point where it's actually possible to do the main things you want from a music remote, and I'm releasing my newborn into the world. I'm very happy with the result: I can maintain a single code base for the front-end and seamlessly control 2 completely different kinds of players with an app that looks exactly the same in both cases. It also means that future improvements like Siri support will be available for both types of players with almost no additional effort. How nice is that.