Chef de Cambuse, cruising and wine

Archive for January 2007

XBox 360 UPnP client and Coherence in communication therapy

written by Frank, on Jan 31, 2007 6:25:00 PM.

The XBox 360 UPnP client is a bit peculiar and finicky in its dealings – well, actually it only wants to talk to its really close friends: Windows Media Player 11 and the Windows Media Connect server.

With a lot of patient care from Tristan Seligmann and a still-unknown contributor we managed to win its confidence, and now that poor little thing has started to talk via UPnP to Coherence – so far it requests audio and image files from the MediaServer, even from the Flickr backend.

To be honest, it was more the devotion of Coherence than the willingness of the XBox that made this happen, especially when it came to the UPnP idiom spoken in the suburbs of Seattle, but anyway it is nice to see this cagey behaviour come to an end.

And what’s even greater to see is Coherence being of real use to somebody outside the project – and, furthermore, getting something back!

accessing Flickr the UPnP way

written by Frank, on Jan 20, 2007 9:46:00 PM.

As a follow-up to my last post about the GStreamer backend for Coherence, I’m happy to now announce the Coherence Flickr UPnP A/V MediaServer backend.

What it does:

  1. it connects to Flickr via the Flickr API and requests a list of interesting photos for the most recent day
  2. it provides that list via the ContentDirectory service to other UPnP A/V enabled devices
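Step 1 boils down to one Flickr REST call plus constructing a static image URL for each returned photo. A minimal sketch of both pieces – the API key is a placeholder you'd have to obtain from Flickr, and the static-URL scheme shown is the one Flickr documents today:

```python
import urllib.parse

FLICKR_REST = "https://api.flickr.com/services/rest/"
API_KEY = "your-api-key-here"  # placeholder


def interestingness_request(per_page=100):
    """Build the REST URL for the flickr.interestingness.getList method."""
    params = {
        "method": "flickr.interestingness.getList",
        "api_key": API_KEY,
        "per_page": per_page,
    }
    return FLICKR_REST + "?" + urllib.parse.urlencode(params)


def photo_url(photo, size=""):
    """Construct the static image URL from a photo's attributes.

    'photo' is a dict with the id/secret/server fields Flickr returns
    for each <photo> element; 'size' is a suffix like '_m' for medium.
    """
    return "https://live.staticflickr.com/%s/%s_%s%s.jpg" % (
        photo["server"], photo["id"], photo["secret"], size)
```

The ContentDirectory side then only has to hand out these URLs as resources.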

So if you have a multi-functional home-automation touch-panel like mine, or one of those TVs with a built-in UPnP A/V client, you can now enjoy an ever-changing stream of photos coming to your living room.

Of course there will be Flickr-enabled picture frames popping up from every manufacturer climbing on the bandwagon – which will definitely have their right to exist, especially if they are used in an isolated environment – but if you are in a connected home it sounds unreasonable to supply configuration to every device all over the place.

Hence the charm of these few lines of Python code lies in the possibility of making Coherence the one-stop-shopping media gateway.

I'll add a few more features over the next few days:

  • query user-, group- or tag-based photo lists
  • optionally only show photos in landscape format
  • define the picture quality to download
  • make the number of photos returned configurable, currently it is limited to 100
  • recheck with the Flickr service every e.g. 180 minutes and fetch new photo URLs
  • maybe act as a proxy to Flickr for devices which are not able to connect to anything outside the home LAN
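The landscape filter and the configurable limit are the easy ones. A sketch of what they could look like, assuming each photo dict carries width and height fields (e.g. from the API's dimension extras – the field names here are my assumption):

```python
def filter_photos(photos, landscape_only=False, limit=100):
    """Keep at most 'limit' photos, optionally only landscape ones.

    A photo counts as landscape when its width exceeds its height.
    Photos without dimension information are dropped by the filter.
    """
    if landscape_only:
        photos = [p for p in photos
                  if int(p.get("width", 0)) > int(p.get("height", 0))]
    return photos[:limit]
```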

And something I would like to add too – but this will take some more time – is a mapping from the UPnP ContentDirectory ImportResource() and CreateObject()/MoveObject()/… actions to the Flickr upload API.

gluing together GStreamer's playbin and Coherence's UPnP A/V MediaRenderer

written by Frank, on Jan 11, 2007 2:28:00 PM.

In an attempt to work on the documentation, and as practice in finding the words to explain how Coherence ticks, I'll try to shed some light on how the GStreamer playbin backend and Coherence are glued together to form a UPnP A/V MediaRenderer device.

A device in UPnP speak is a logical something – a grouping of services. A MediaRenderer, for example, needs to support

  • a RenderingControl service, which is responsible for the adjustment of volume, loudness, brightness,…
  • a ConnectionManager service, used to enumerate and select the transfer protocol (http, rtsp/rtp,…) and
    data format (mp3, ogg,…) to be used for transferring the content
  • an AVTransport service – optional, depending on the transfer protocols the ConnectionManager supports, but if available used to specify the content to be played, start/stop/pause playback, seek in the content stream,…

These services can be controlled and queried with so-called Actions, e.g. SetVolume() and GetVolume(), and hold their current state in so-called StateVariables, e.g. Volume. A call to SetVolume() sets the volume value on the actual rendering engine and reflects that change in the variable Volume, and a call to GetVolume() retrieves the value from that variable. Furthermore a service is supposed to propagate the values of these variables on change to interested parties, e.g. to a UPnP ControlPoint.
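The relationship between actions, state variables and eventing can be pictured with a toy service – this is purely illustrative and not Coherence's actual class layout:

```python
class ToyRenderingControl:
    """A toy service: actions read and write state variables, and
    every variable change is pushed to subscribed parties (the idea
    behind GENA eventing)."""

    def __init__(self):
        self.variables = {"Volume": 50}
        self.subscribers = []  # callbacks of interested control points

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def set_variable(self, name, value):
        self.variables[name] = value
        for notify in self.subscribers:  # event the change
            notify(name, value)

    def SetVolume(self, DesiredVolume):
        # a real service would also adjust the rendering engine here
        self.set_variable("Volume", DesiredVolume)

    def GetVolume(self):
        return self.variables["Volume"]
```

A subscribed control point sees every SetVolume() reflected as a Volume event without having to poll GetVolume().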

Now one of the basic principles of Coherence is to seal off the backend – the actual rendering engine, in this case the GStreamer playbin – from the UPnP related tasks (and later from the D*AP ones) as much as possible.

This means that a backend should only need to

  1. tell Coherence what kind of device(s) it wants to appear as, e.g. a MediaRenderer
  2. define specific conditions, e.g. that it supports only decoding of mp3 content
  3. provide methods Coherence can access when an action is called that has effects on the backend engine, e.g. a volume change
  4. inform Coherence about state changes, e.g. that the end of the media content is reached and playback has stopped

Everything else is the job of Coherence:

  • announce the device on the network (SSDP/M-SEARCH)
  • generate the necessary device and service descriptions with reasonable defaults, corresponding to the backend's capabilities
  • provide the control interfaces to the actions the services have
  • handle all event subscription and propagation tasks (GENA)

This also means that all actions with no effect on the backend engine are handled completely within Coherence, like an action that only retrieves a variable value. Of course a backend can override this if it wants to, but basically there is no need to.
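How such a default answer could be synthesized is easy to sketch: prefer a backend override if one exists, otherwise answer Get… actions straight from the state variable. This is illustrative only, not Coherence's real dispatcher:

```python
def handle_action(service, backend, action, **kwargs):
    """Dispatch an incoming UPnP action.

    If the backend provides its own upnp_<Action> method, call it;
    otherwise answer 'Get<Var>' actions from the service's state
    variable, so the backend never has to care about them.
    """
    override = getattr(backend, "upnp_" + action, None)
    if override is not None:
        return override(**kwargs)
    if action.startswith("Get"):
        name = action[3:]  # e.g. GetVolume -> Volume
        return {"Current" + name: service.variables[name]}
    raise NotImplementedError(action)
```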

In our GStreamer playbin case this (simplified) looks like this:

import gst  # GStreamer python bindings

class Player:

    # see item 1. above
    implements = ['MediaRenderer']

    # see item 2. above
    vendor_value_defaults = {'RenderingControl': {'A_ARG_TYPE_Channel': 'Master'},
                             'ConnectionManager': {'SinkProtocolInfo': 'http-get:*:audio/mpeg:*'}}
    vendor_range_defaults = {'RenderingControl': {'Volume': {'maximum': 100}}}

    def __init__(self, device):
        self.player = gst.element_factory_make("playbin", "myplayer")
        self.device = device

    def set_volume(self, volume):
        # here the actual volume change takes place;
        # gstreamer playbin has a volume range from 0.0-10.0,
        # UPnP uses 0-100, so scale accordingly
        self.player.set_property('volume', float(volume) / 10)

        # feed back the new state - our item 4.
        rcs_id = self.device.connection_manager_server.lookup_rcs_id(self.current_connection_id)
        self.device.rendering_control_server.set_variable(rcs_id, 'Volume', volume)

    def upnp_SetVolume(self, *args, **kwargs):
        # that's a method referred to in item 3.
        InstanceID = int(kwargs['InstanceID'])
        Channel = kwargs['Channel']
        DesiredVolume = int(kwargs['DesiredVolume'])
        self.set_volume(DesiredVolume)
        return {}
This is not perfect and not final yet, and there will be some changes, especially in the way item 4. is handled. But the purpose of that backend is, among other things, to straighten out the bumpiness of these glue points.

And btw., in this backend the engine is embedded, but as stated in an earlier post, there is absolutely no reason why the call to self.player.set_property to set the volume couldn't instead be some IPC call (XML-RPC, D-Bus, Twisted PB,…) or an emitted IR/ZigBee/Z-Wave signal.

A GStreamer based UPnP MediaRenderer with Coherence

written by Frank, on Jan 10, 2007 3:34:00 PM.

Over the last few days I worked on the GStreamer UPnP A/V MediaRenderer backend for Coherence.

The current objective is to obtain a certain base functionality that will allow us to test the renderer in more real-life scenarios and detect the points of contact between the MediaRenderer backend stub and the real media ‘device’, like the GStreamer playbin here or a MediaCenter like Elisa.

So far we can

  • access an audio file on a UPnP MediaServer
  • start, stop and pause playback
  • mute/unmute and change the volume

Controlling it with the Intel AV Media Controller works reasonably well, although there is an issue with the volume control: it seems that the allowedValueRange for the volume is not respected by the controller.
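For reference, the range a controller is supposed to respect is declared in the RenderingControl service description, roughly like this trimmed SCPD fragment (matching the maximum of 100 the backend announces):

```xml
<stateVariable sendEvents="no">
  <name>Volume</name>
  <dataType>ui2</dataType>
  <allowedValueRange>
    <minimum>0</minimum>
    <maximum>100</maximum>
    <step>1</step>
  </allowedValueRange>
</stateVariable>
```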

I’ll add the seek functions and the SetNextAVTransportURI action next, and then it is back to the MediaServer, trying to add media upload functionality (CreateObject/ImportResource) to it.