Many years ago I had a paper published on an implementation of Connectionless ATM. It described the design of a system for efficiently mapping IP (a connectionless protocol) to ATM (a connection-orientated protocol). At the time, this was an interesting area and there are quite a few other papers and patents on the subject. It looks like one group actually did build some hardware to implement something similar – check out Wormhole IP over Connectionless ATM here. ATM is now yesterday’s news of course, not even good enough for dodgy fish and chips (that’s a Brit reference – sorry). TCP/IP is now the dominant protocol suite, and it consists of an end-to-end connection-orientated protocol (TCP) running over a connectionless layer (IP).
Something interesting happens, though, when you put a connectionless layer on top of TCP. It sounds totally stupid but actually has some interesting qualities. Essentially, TCP connections between elements become virtual links – transparent byte streams with which you can do anything. Plus it leverages decades of refinement in TCP itself, so it can deal with large transit delays, congested networks and so on, degrading gracefully rather than abruptly, as is the case with protocols that merely drop packets under congestion instead of performing rate control.
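To make the "virtual link" idea concrete, here is a minimal sketch in Python of length-prefixed framing – one common way to carry discrete datagrams over a TCP byte stream. This is purely illustrative; it is not Syntro's actual wire format, and the 4-byte big-endian length prefix is an assumption for the example.

```python
import struct

def frame(datagram: bytes) -> bytes:
    # Prefix each datagram with a 4-byte big-endian length so the
    # receiver can recover message boundaries from the TCP byte stream.
    return struct.pack(">I", len(datagram)) + datagram

def deframe(buffer: bytes):
    # Extract complete datagrams from received bytes; return any
    # trailing partial data so the caller can append more and retry.
    msgs = []
    while len(buffer) >= 4:
        (length,) = struct.unpack(">I", buffer[:4])
        if len(buffer) < 4 + length:
            break  # datagram not fully received yet
        msgs.append(buffer[4:4 + length])
        buffer = buffer[4 + length:]
    return msgs, buffer
```

Because TCP delivers an ordered, reliable byte stream, this tiny layer is all it takes to turn a connection into a datagram-carrying virtual link, with TCP's congestion and flow control coming along for free.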
This is basically Software Defined Networking on the cheap! You can implement any switching and routing concept using the virtual links as the interconnects. This is the essence of ICOCO, the ultimate expression of Syntro…
Continue reading ‘The private cloud network infrastructure – how Syntro does it (hint – it’s essentially Software Defined Networking)’
Syntro video apps currently use Motion JPEG (MJPEG) to transfer video across the cloud. Most modern webcams (the Logitech C920 is one of our faves) can directly generate JPEG data instead of YUV data. Clearly, this makes it very easy to stream MJPEG since the compression has already been performed, even if there is no hardware support for compression on the processor itself. However, Syntro apps also need the option of video motion detection, streaming video and audio only when something is happening – how can they do that without fully decompressing the data? We use a number of tricks to keep the processing overhead manageable even on small embedded systems such as the Raspberry Pi and BeagleBone Black, two systems we regularly use as sources of video and audio in Syntro clouds.
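The post doesn't spell out the tricks, but one well-known cheap heuristic – shown here as an illustrative sketch, not Syntro's actual method – is to watch the compressed frame size itself. A JPEG's size tracks scene complexity, so a sudden change in size between frames is a decompression-free hint that something moved. The window and threshold values below are arbitrary assumptions.

```python
def frame_size_motion(sizes, window=8, threshold=0.15):
    # Flag motion when a frame's compressed size deviates from the
    # recent rolling average by more than `threshold` (fractional
    # change). No JPEG decoding is needed -- only the byte count.
    flags, history = [], []
    for s in sizes:
        if history:
            avg = sum(history) / len(history)
            flags.append(abs(s - avg) / avg > threshold)
        else:
            flags.append(False)  # no history yet for the first frame
        history.append(s)
        if len(history) > window:
            history.pop(0)
    return flags
```

A real system would combine a heuristic like this with occasional partial decodes to reject false positives from lighting changes, but the size check alone costs essentially nothing per frame.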
Continue reading ‘Efficient video motion detection – how Syntro does it’
GStreamer makes advanced video and audio compression accessible to everyone. Traditionally, Syntro has used motion JPEG for video and PCM for audio. There are many good reasons for this – simplicity, low processor overhead, quality of still images, universality etc. However, the amount of data that needs to be transferred and stored can quickly become prohibitive, especially when there are a lot of sources running full HD.
So it is logical to use video compression techniques that make use of temporal redundancy as well as spatial redundancy. However, a Syntro cloud has a unique and highly optimized architecture for multicasting live streaming data – essentially Syntro guarantees that data is only multicast at the point of delivery, minimizing the number of times that a packet is copied. While this works trivially for MJPEG/PCM, the same cannot be said for standard multiplexed MPEG4 and H.264. These codecs tend to assume that each endpoint has its own session which begins when the connection is made. What is needed for Syntro is a network-layer mechanism that doesn’t require the switches to do any heavy processing and allows receivers to join or leave at any point with only local impact.
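The "multicast at the point of delivery" idea can be sketched as a tiny delivery-tree model – an illustrative toy, not Syntro's implementation. A packet travels once over each link and is only duplicated where paths to receivers diverge, so the source link carries one copy no matter how many receivers sit behind a downstream switch.

```python
class Node:
    # One element in a delivery tree: a source, switch or endpoint.
    def __init__(self, name):
        self.name = name
        self.children = []    # downstream nodes, one link to each
        self.local_subs = 0   # receivers attached directly here

    def has_subscribers(self):
        return self.local_subs > 0 or any(
            c.has_subscribers() for c in self.children)

    def deliver(self, link_copies):
        # Send one copy down each link that leads to at least one
        # subscriber -- never one copy per subscriber.
        for child in self.children:
            if child.has_subscribers():
                link_copies.append((self.name, child.name))
                child.deliver(link_copies)

def count_copies(root):
    copies = []
    root.deliver(copies)
    return len(copies)
```

With two receivers behind a single switch, the source sends one copy to the switch and the fan-out to two copies happens there, at the point of delivery. Naive per-receiver unicast would instead carry two copies over the source's link.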
Continue reading ‘Multicasting GStreamer video and audio: how Syntro does it’
Published September 13, 2013
Private cloud, Syntro
Right now it seems everything is heading for the cloud – the public cloud that is. And there are many advantages for things such as scalable web services, global media delivery platforms etc. However, some applications seem to be taking things too far. When you are using a device on a LAN and streaming video from another device on the same LAN and it has to go via the public cloud, something is seriously wrong!
Streaming video (and any other high-bandwidth service such as file sharing and remote file access) in this way causes two unnecessary traversals of the LAN’s access to the Internet. This makes no sense! For one thing, in residential situations, the cable companies are getting pretty vicious about data caps, so unnecessary high-bandwidth streaming is bad news from that perspective alone. Not to mention leaving less bandwidth for other applications that actually have to use the Internet.
It’s also easy to forget that the public cloud isn’t free. Someone is paying for the compute time and at least one direction of data transfer. Consequently it’s important to understand what’s being done with the data that’s bouncing off the public cloud…
So what’s the solution? Private cloud technology like Syntro makes it possible to implement the same kinds of services as exist in the public cloud but with advantages such as data localization (i.e. keeping the data only where it needs to be), data privacy (nobody else gets to see your data) and the ability to process and mine that data at your leisure and without third-party services. Sure, the public cloud is useful. But the private cloud in partnership with the public cloud makes everything more efficient and puts you back in control of your data.
Published September 11, 2013
Github, Yocto Project
Make sure you have SSH keys for Github set up first, so that you can do this from the command line on the build machine:
git clone git@github.com:Pansenti/SymotesDB.git
Then the SRC_URI to use in the bitbake recipe is
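The exact SRC_URI value isn't preserved in this excerpt. For reference, a typical SSH-protocol Git SRC_URI in a BitBake recipe looks like the fragment below; the repository path is assumed from the clone command above, and the branch name is a placeholder.

```bitbake
# Hypothetical example -- adjust repository, branch and SRCREV to match
# the actual recipe.
SRC_URI = "git://git@github.com/Pansenti/SymotesDB.git;protocol=ssh;branch=master"
SRCREV = "${AUTOREV}"
S = "${WORKDIR}/git"
```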
Pansenti is pleased to announce that symotes is powered by Syntro, the Lightweight Compute Cloud. symotes is a breakthrough in the use of a private cloud system to enable smart sensing of spaces with embedded intelligence.
symotes empowers anyone to convert the places where they work, live and play into fully sensed environments. Authenticated users can access real-time and historic data via a web browser on any modern PC, smartphone or tablet. Real-time intelligence processes data feeds from multiple sensors and integrates them into actionable information. Real-time or historic video, audio or other sensor data can be viewed from within the space or remotely via the Internet as needed.
The use of a private cloud endows the symotes system with a number of benefits. The sensed data from a space remains within that space and is only available remotely via the Internet to suitably authenticated users over encrypted links. The symotes system works purely within the LAN environment, thereby completely eliminating unnecessary loading on the Internet access link (which in many cases is subject to data caps that prevent real-time, 24/7 streaming to a public cloud service).
For more information on symotes and upcoming products, check out symotes and sign up for Twitter updates at @symotes.
Published August 5, 2013
We ordered a couple of Wandboard Dual boards a while back, but hadn’t really done anything with them other than install Ubuntu and make sure they booted.
The Wandboards use i.MX6 Cortex-A9 processors from Freescale. They come in Solo, Dual and Quad core versions, all running at 1GHz, with 512MB, 1GB and 2GB of DDR3 respectively. All of the boards come with GigE Ethernet. The Dual and Quad versions add WiFi (802.11n) and Bluetooth radios.
Continue reading ‘Wandboards!’
Anyone who uses apps such as Runkeeper will know the problem – it’s really annoying to have to find your smartphone to start and stop it, change settings etc. And it’s not just that, it’d be nice to control other functions without having to actually look at the screen. Well, HOT Watch seems to be one of the first devices that really addresses this problem. While they don’t particularly mention it, the HOT Watch should also mean that you don’t have to carry a phone about in your house any more, if the BT4.0 range is good enough. It’ll be interesting to see how well it works in real life.
Now we just need someone to produce a tiny smartphone with a small, minimal functionality display that relies on remote I/O devices…
Published July 24, 2013
We recently decided to add web interfaces to some of our Syntro applications. Syntro is a C++/Qt framework and we wanted a library that would work well with existing code and yet minimize our web application learning curve.
We came upon Wt, a C++ Web Toolkit.
Continue reading ‘Choosing Wt Web Toolkit to develop Syntro frontends’
Some more detailed instructions on using the latest meta-pansenti Yocto layer to build Gumstix and BeagleBone systems can be found on the jumpnow site.
Building Gumstix images with the Yocto Project
These systems come with the SyntroCore libraries and headers installed and SyntroLCam, a web camera application, running on startup.
The title says Gumstix, but we use the same meta-layer for BeagleBone and Wandboard systems.