Hueblog: Philips Hue API: Many possibilities remain unused

Developers are being slowed down

About two years ago, Philips Hue introduced a new API. API stands for Application Programming Interface: an interface through which apps send commands to the Hue Bridge to perform certain actions. In development, however, there are still major restrictions, as Stefan Göhler, the developer of iConnectHue, told me in conversation.

In principle, developers of third-party apps, including Hue Essentials or All 4 Hue, have access to the new interface. Otherwise, it would not be possible for them to control new products, such as the Gradient light bulbs.


Scripts are not yet sufficiently documented by Philips Hue

With the Philips Hue contact sensor, a new sensor has appeared for the first time since the introduction of the new API. Until now, so-called rules were used for accessories, and these could be programmed very flexibly. With the new interface, however, Philips Hue relies on scripts – and this is where the problems for third-party providers begin.

Currently, developers like Stefan Göhler cannot build their own scripts. Only the existing scripts can be used, and even those are not yet documented. This also means: with the new contact sensor, third-party apps can only offer the functions that Philips Hue has already scripted.
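For context, what third-party apps can do today is enumerate the predefined scripts the bridge ships with. A minimal sketch, assuming the CLIP v2 `behavior_script` resource and the usual `hue-application-key` header; the bridge address and application key below are placeholders:

```python
import urllib.request

BRIDGE = "192.168.1.2"             # placeholder bridge address
APP_KEY = "your-application-key"   # placeholder hue-application-key

def build_script_request(bridge: str, app_key: str) -> urllib.request.Request:
    """Build a GET request for the bridge's list of predefined behavior scripts."""
    return urllib.request.Request(
        f"https://{bridge}/clip/v2/resource/behavior_script",
        headers={"hue-application-key": app_key},
    )

req = build_script_request(BRIDGE, APP_KEY)
# req.full_url -> "https://192.168.1.2/clip/v2/resource/behavior_script"
```

Sending this request (with the bridge's self-signed certificate accepted) returns metadata about each script – but, as described above, not a way to define your own.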

There would be so many possibilities

And yet everything could be so beautiful. If developers had access to the scripting or behaviour API of Philips Hue, they could massively expand the capabilities of the Hue system.

For example, it would be possible to use a switch on the first bridge to trigger a script that controls lamps on the second bridge. Even actions outside the Hue system, such as controlling a Sonos speaker, could be realised in this way.

So far, there are no signs that the current situation will change in the near future. Especially for users who are more intensively involved with the system, exciting options are lost in this way.

Note: This article contains affiliate links. We receive a commission for purchases via these links, which we use to finance this blog. The purchase price remains unchanged for you.

Check your local Philips Hue Online Store for availability of Hue products in your country. There you will also find all the technical information and prices.



Over the past few years, I have become a real expert on Hue & HomeKit. By now I have over 50 lamps and numerous switches in use. In my small blog, I am happy to share my experiences with you.

Comments (2 replies)

  1. This is definitely a shame. I’m surprised Signify haven’t used this to produce a pro bridge that binds a number of bridges together and makes them function and appear as a single bridge, letting you expand your Hue system with fewer barriers. But if there’s no real scripting, I guess that’s not possible right now, maybe even for Signify.

  2. The programmable rule engine in API v1 was simple, with a single representation for all rules. A rule was made of a generic list of conditions the bridge would evaluate and of a generic list of actions to perform in case the conditions were met. With that, it was possible to define custom behaviors beyond what the official Hue app supported. The main limitation (imo) was the relatively narrow range of possible conditions and actions and the ambiguity between triggers and conditions.
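To illustrate the v1 shape the commenter describes, here is a sketch of a rule as it would be posted to `/api/<username>/rules`. The sensor and group IDs and the button event code are made up; the `dx` condition on `lastupdated` also shows the trigger/condition ambiguity mentioned above, since "value changed" is how v1 rules typically express a trigger:

```python
# Sketch of a v1 rule: a generic list of conditions plus a generic
# list of actions. IDs and the button event value are illustrative.
rule = {
    "name": "Dimmer press -> group 1 on",
    "conditions": [
        {
            "address": "/sensors/2/state/buttonevent",
            "operator": "eq",
            "value": "1000",   # illustrative button event code
        },
        {
            "address": "/sensors/2/state/lastupdated",
            "operator": "dx",  # "value changed" doubles as the trigger
        },
    ],
    "actions": [
        {
            "address": "/groups/1/action",
            "method": "PUT",
            "body": {"on": True},
        }
    ],
}
```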

    With API v2, they should have evolved this programmable rule engine to address those restrictions, making it more developer-friendly and powerful. Instead, they introduced something that isn’t made of proper REST resources. What we now see is this concept of “behavior scripts”: essentially metadata about internal scripts, along with an internal JSON schema that developers can’t even access. Exposing them as API resources makes little sense; they belong in the official API documentation itself.

    On top of that, developers are expected to create “behavior instances” to configure devices. These contain a piece of configuration constrained by the JSON schema from the “behavior script” of the specific device you’re trying to configure. This means you have to understand and work with multiple schemas, for instance one for configuring contact sensors and another for tap dials. And if Signify decides the contact sensor can’t use time-of-day triggers or dark/not-dark logic (which appears to be the case), then you simply can’t implement those features in your app.
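To make the contrast with the v1 rule concrete, here is a sketch of a v2 `behavior_instance` creation payload (as posted to `/clip/v2/resource/behavior_instance`). The script id is a placeholder, and the entire `configuration` object is hypothetical: its real shape is dictated by the undocumented JSON schema of the referenced behavior script, which is precisely the problem described above.

```python
# Sketch of a v2 behavior_instance payload. The "configuration" keys
# below are invented for illustration -- the real schema comes from the
# (undocumented) behavior_script resource and is not known to third parties.
instance = {
    "type": "behavior_instance",
    "script_id": "00000000-0000-0000-0000-000000000000",  # placeholder UUID
    "enabled": True,
    "metadata": {"name": "Front door contact"},
    "configuration": {
        "device": {"rid": "11111111-1111-1111-1111-111111111111",
                   "rtype": "device"},
        "when_opened": {"action": "turn_on"},  # hypothetical key
    },
}
```

Note how the developer no longer composes conditions and actions freely; they can only fill in whatever slots the script’s schema happens to offer.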

    To me this shows the team at Signify needs to refine its approach to designing APIs, especially one meant for external consumption. This API is clearly designed primarily for them and their own app. And while the official app has improved significantly, they’ve made it very hard for third-party developers to build anything different or more advanced.

    @Signify: don’t forget that a hallmark of a good API is the freedom for developers to craft custom scenarios that might not have been envisioned by the original creators (you). Also, don’t ask developers to migrate to a v2 that offers fewer capabilities than v1. Provide a migration path first.
