22 May 2010

A new site!

If you're trying to get to this site via its former alias http://www.paolomanna.com, you'll be surprised to find yourself in a completely different place!
I'm now building the new site there as a technical reference for my work: this site will stay as it was originally intended, i.e. some technical musings in my native language. The interesting posts in English have been carried over, so nothing is lost!
See you soon at my new site: for now, please excuse the mess, I'm still cleaning up the place...

28 February 2010

QC LuaPlugin - now with a (slightly) better editor and images

Throughout February I've been working on and off on my Lua plugin for Quartz Composer.

The Good
  • Courtesy of Noodlesoft, the editor is nicer, and shows line numbers and a marker when an error is detected.
  • There is now experimental support for the image type: still no way to interpret images inside a script (and, anyway, why would you?), but they're properly recognized and passed around
The Bad
  • I've also added a composition to do some simple benchmarking, and the result is that Lua lags behind properly written JavaScript (tests done on 10.6)... Not surprising, given that Lua is standing (almost) still while Apple has put a lot of effort into JavaScriptCore lately! The lack of a 64-bit LuaJIT is starting to hurt...
The Ugly
  • To implement images, I've had to use undocumented APIs (the QCImage class isn't public), as there was no (simple) way to get what I needed through the official ones...

29 January 2010

Hidden Gems of Snow Leopard: IOSurface on Google Code

The sample code, cleaned up and tested a bit, has a new home. Enjoy!

Previous posts:

QC LuaPlugin Updated

I've finally found the time to review the pending issues left in my Lua plugin for Quartz Composer. The features that had been left out are now working, and I've added a bit of debugging help at the source code level. Still, the goal of getting a working JIT version (which would speed up scripts) is held back by the fact that there is no 64-bit LuaJIT yet...

15 October 2009

Hidden Gems of Snow Leopard: IOSurface (with video in)

Video input

I've modified the command line tool to accept a few more parameters, in order to use a video input as the frame source. The code, thanks to the QTCapture framework, is pretty straightforward, and it's interesting to note how little was needed to get it working. Even with so little documentation, IOSurface clearly integrates perfectly with the existing technologies!
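For the curious, here's a minimal sketch of what that setup looks like, assuming an IOSurface-backed pipeline like the movie player's (the function name and the delegate object are mine, error handling omitted):

    #import <QTKit/QTKit.h>
    #import <CoreVideo/CoreVideo.h>
    #import <IOSurface/IOSurface.h>

    // Grab frames from the default camera, asking Core Video for pixel
    // buffers backed by global IOSurfaces, just as in the movie-player case.
    static QTCaptureSession *StartCaptureSession(id frameDelegate)
    {
        QTCaptureSession *session = [[QTCaptureSession alloc] init];
        QTCaptureDevice *camera =
            [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
        [camera open:NULL];
        [session addInput:[[[QTCaptureDeviceInput alloc]
                              initWithDevice:camera] autorelease]
                    error:NULL];

        QTCaptureDecompressedVideoOutput *output =
            [[[QTCaptureDecompressedVideoOutput alloc] init] autorelease];
        NSDictionary *surfaceProps =
            [NSDictionary dictionaryWithObject:(id)kCFBooleanTrue
                                        forKey:(NSString *)kIOSurfaceIsGlobal];
        [output setPixelBufferAttributes:
            [NSDictionary dictionaryWithObject:surfaceProps
                                        forKey:(NSString *)kCVPixelBufferIOSurfacePropertiesKey]];
        // The delegate receives captureOutput:didOutputVideoFrame:
        // withSampleBuffer:fromConnection: and publishes each frame's
        // IOSurfaceID just like the movie player does.
        [output setDelegate:frameDelegate];
        [session addOutput:output error:NULL];
        [session startRunning];
        return session;
    }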


Previous posts:

10 October 2009

Hidden Gems of Snow Leopard: IOSurface (again)

The QCPlugin

To extend the previous sample, I've now added a Quartz Composer plugin that spawns the CLI application: it's also possible to choose at compile time (through a #define) whether the image is provided to QC as a GL texture or as a pixel buffer.
A sample composition has been included in the code.
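To give an idea, here's a sketch of how such a compile-time switch might look inside the plugin's execution method (the outputImage port and the frame variables are hypothetical placeholders; the two provider methods are the standard QCPlugInContext ones for texture and buffer output):

    #define USE_GL_TEXTURE 1 // flip to 0 to hand QC a pixel buffer instead

    // Inside the plugin's execute method: wrap the frame we got from the
    // CLI either as a GL texture or as a plain memory buffer.
    - (BOOL)execute:(id<QCPlugInContext>)context
             atTime:(NSTimeInterval)time
      withArguments:(NSDictionary *)arguments
    {
        NSUInteger width = 640, height = 480; // hypothetical frame size
    #if USE_GL_TEXTURE
        GLuint textureName = 0; // hypothetical: texture built from the frame
        self.outputImage = [context
            outputImageProviderFromTextureWithPixelFormat:QCPlugInPixelFormatBGRA8
                                               pixelsWide:width
                                               pixelsHigh:height
                                                     name:textureName
                                                  flipped:NO
                                          releaseCallback:NULL
                                           releaseContext:NULL
                                               colorSpace:[context colorSpace]
                                         shouldColorMatch:YES];
    #else
        const void *baseAddress = NULL; // hypothetical: the frame's pixels
        self.outputImage = [context
            outputImageProviderFromBufferWithPixelFormat:QCPlugInPixelFormatBGRA8
                                              pixelsWide:width
                                              pixelsHigh:height
                                         baseAddress:baseAddress
                                             bytesPerRow:width * 4
                                         releaseCallback:NULL
                                          releaseContext:NULL
                                              colorSpace:[context colorSpace]
                                        shouldColorMatch:YES];
    #endif
        return YES;
    }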

Notes
  • The embedded CLI application is set as a dependency of the other 2 targets in the project, and should be compiled automatically: however, in certain cases Xcode appears to "forget" to apply the build flags and tries to compile it as 64-bit (which fails, as QuickTime doesn't exist in that universe). To work around this issue, compile the IOSurfaceCLI target separately.
  • A bug seems to affect Quartz Composer whenever a movie is started and stopped multiple times: the projection matrix, for some reason, isn't reset, and the frame appears much bigger than it should be. A bug report has been filed with Apple.


Other posts on the same topic:

25 September 2009

Hidden Gems of Snow Leopard: IOSurface

Snow Leopard may not have looked much different from its predecessor from the average user's point of view: however, for developers like myself, a lot of things have changed, some well advertised (say, GCD, OpenCL and 64-bit support), some still to be discovered. A good overview can be found at Ars Technica.
Tackling one long-overdue issue, not to mention all the limitations deriving from its venerable age, Apple has taken a first step toward the future of the old QuickTime (which will stay in the 32-bit universe) with the new QuickTime X: however, QuickTime isn't so easy to replace in one shot, and it's still present in the system, transparently invoked by QuickTime X (or, for us developers, by QTKit) whenever it's needed.
But how can a 64-bit application (like the QuickTime X Player, or the Finder itself) use a 32-bit library? The answer is: it doesn't. The technique used behind the scenes is far more interesting: when a 64-bit application needs a frame from a movie it can't process otherwise, a helper process is launched (you'll see it in Activity Monitor as QTKitServer-(process-ID) process-name) that hands the frames back to the 64-bit app.
Hey, isn't that nice? Graphics passed from one process to another: how do they do that? The answer appears to lie in a new framework, IOSurface.

Disclaimer: the following statements are the result of personal experimentation: as such, they represent neither official documentation nor endorsement.

IOSurface is included among the new public frameworks, but no mention of it exists in the official documentation: looking at the various C headers, however (not only in IOSurface.framework, but across the graphics libraries in general - Spotlight's your friend), it's possible to get a glimpse of its capabilities.

Putting together some sample code

A good example of how IOSurface works could be a quick and dirty implementation of a QTKitServer lookalike: a 32-bit faceless application that plays any QuickTime movie, and its 64-bit companion that shows the frames in an OpenGL view. More specifically, an IOSurface can be attached to many kinds of graphics surfaces and passed around among different tasks, which makes it the perfect candidate for our own QTKitServer clone. The link to the Xcode project is below - 10.6-only, of course.

The faceless movie player

Now, let's see how to create frames on IOSurfaces. For a start, we can create Core Video pixel buffers (one way to define an offscreen destination for QuickTime movies - see the excellent QTCoreVideo sample projects) that have IOSurfaces bound to them: when we create the QTPixelBufferContext, putting the proper entries in the optional dictionary will instruct Core Video to attach an IOSurface to each pixel buffer we get back. Each CVPixelBuffer we receive from Core Video can then be asked for its related IOSurfaceRef: IOSurfaceRefs are the references to use inside the same application, and each surface also has a unique IOSurfaceID that other processes can use to obtain a local IOSurfaceRef of their own.
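A minimal sketch of that optional dictionary (naming is mine, error handling omitted): the nested kCVPixelBufferIOSurfacePropertiesKey entry is what triggers the IOSurface attachment, and it's also where the kIOSurfaceIsGlobal flag discussed below goes.

    #import <Foundation/Foundation.h>
    #import <QuickTime/QuickTime.h>
    #import <CoreVideo/CoreVideo.h>
    #import <IOSurface/IOSurface.h>

    // Create a QTPixelBufferContext whose buffers are backed by global
    // IOSurfaces (32-bit only, as it relies on the QuickTime framework).
    static QTVisualContextRef CreateIOSurfaceBackedContext(void)
    {
        NSDictionary *surfaceProps = [NSDictionary
            dictionaryWithObject:(id)kCFBooleanTrue
                          forKey:(NSString *)kIOSurfaceIsGlobal];
        NSDictionary *bufferAttrs = [NSDictionary
            dictionaryWithObject:surfaceProps
                          forKey:(NSString *)kCVPixelBufferIOSurfacePropertiesKey];
        NSDictionary *contextAttrs = [NSDictionary
            dictionaryWithObject:bufferAttrs
                          forKey:(NSString *)kQTVisualContextPixelBufferAttributesKey];

        QTVisualContextRef context = NULL;
        QTPixelBufferContextCreate(kCFAllocatorDefault,
                                   (CFDictionaryRef)contextAttrs, &context);
        return context; // attach to the movie with SetMovieVisualContext()
    }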
For the sample I've put together, I've used the simplest way of passing IOSurfaces around, i.e. asking for them to be created as global and handing over the IDs: not the ideal solution in the long term, but the alternative (passing them through Mach ports) looks more complex and more error-prone to implement without the docs.
The small CLI app takes the movie to play as its only argument, and passes the IDs of the surfaces back through a simple pipe. Using the kIOSurfaceIsGlobal option also puts a limit on the consumer side: as the CLI doesn't know anything about the consumer, surfaces are reused as soon as possible, so they have to be consumed at once. Binding them to Mach ports, on the other hand, would force the framework to create new surfaces until the previous ports are deallocated.
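The producer's end of that pipe can then be as simple as this sketch (the function name is mine): once the visual context hands back a frame, write out the ID of its backing surface.

    #include <stdio.h>
    #import <CoreVideo/CoreVideo.h>
    #import <IOSurface/IOSurface.h>

    // Publish the ID of the IOSurface backing this frame on stdout;
    // the consumer reads it on the other end of the pipe.
    static void PublishFrameSurfaceID(CVPixelBufferRef frame)
    {
        IOSurfaceRef surface = CVPixelBufferGetIOSurface(frame);
        if (surface != NULL) {
            printf("%u\n", (unsigned int)IOSurfaceGetID(surface));
            fflush(stdout);
        }
    }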

The 64-bit GUI application

Our 64-bit app is a very simple GUI: nothing really special here - a movie is chosen and passed to our faceless app, launched in the background as an NSTask whose output is captured and parsed for IOSurfaceIDs. The interesting part is in the few lines that take the IOSurfaceID and build a texture we can use: the new call is CGLTexImageIOSurface2D, which is meant to be the IOSurface equivalent of the glTexImage2D used in regular OpenGL to upload images.
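Those few lines boil down to something like the following sketch (the function name is mine; it assumes cgl_ctx is also the current GL context): the global ID received from the CLI is resolved into a local IOSurfaceRef, which is then wrapped in a rectangle texture.

    #import <OpenGL/gl.h>
    #import <OpenGL/CGLIOSurface.h>
    #import <IOSurface/IOSurface.h>

    // Turn a global IOSurfaceID from another process into a local GL texture.
    static GLuint TextureFromSurfaceID(CGLContextObj cgl_ctx, IOSurfaceID surfaceID)
    {
        IOSurfaceRef surface = IOSurfaceLookup(surfaceID);
        if (surface == NULL)
            return 0;

        GLuint texture = 0;
        glGenTextures(1, &texture);
        glBindTexture(GL_TEXTURE_RECTANGLE_ARB, texture);
        // The IOSurface equivalent of glTexImage2D: no pixel copy happens here
        CGLTexImageIOSurface2D(cgl_ctx, GL_TEXTURE_RECTANGLE_ARB, GL_RGBA,
                               (GLsizei)IOSurfaceGetWidth(surface),
                               (GLsizei)IOSurfaceGetHeight(surface),
                               GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV,
                               surface, 0 /* plane */);
        glBindTexture(GL_TEXTURE_RECTANGLE_ARB, 0);
        CFRelease(surface); // the texture keeps its own reference
        return texture;
    }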

Notes
The code is only good as a demo of the capabilities and for experimenting; in many respects, a real-world solution would use very different techniques!


Other posts on the same topic:

06 October 2007

Citrix on the Mac: ICA vs. RDP

This time the comparison is fought on (almost) equal terms, and that's no coincidence: the practical implementation of RDP is Terminal Services, present in the server editions of Windows since NT 4, which has its roots precisely in the then-current version (WinFrame) of what later became Presentation Server, as I've already explained. Over time, development has followed parallel paths, with RDP normally a bit behind in terms of both features and capabilities. So up to RDP 5.2, which is part of Windows Server 2003, both:
  • Use a compact protocol optimized for the Windows graphics system (GDI)
  • Allow a number of devices to be redirected to the remote client (disk, COM ports, printers, audio, clipboard), in addition of course to the classic ones, i.e. screen, keyboard and mouse

With RDP 6, which however will only come into actual use with Windows Server 2008, capabilities that until now had been exclusive to Presentation Server begin to appear, namely:
  • Remote execution of individual applications rather than the whole desktop
  • Only the window of the application actually "published" is displayed on the client, not the whole screen, so it effectively integrates with the other windows of the remote environment (Seamless mode, recently revived on Intel Macs in similar contexts too, for example Parallels Desktop under the name Coherence and VMware Fusion with Unity)
  • Wider support for other remote devices
  • Better font handling
On paper, the differences shrink considerably: Presentation Server still has many advantages, such as better scalability, better optimizations on slow networks, and the option (on Windows only, obviously) of transferring the published application to the remote client for execution, freeing up resources on the server.

Previous posts:
Citrix on the Mac: glossary
Citrix on the Mac: a bit of history
Citrix on the Mac: ICA vs. VNC (and Apple Remote Desktop)
Citrix on the Mac: ICA vs. X Windows

Citrix on the Mac: ICA vs. X Windows

At first sight this seems an odd comparison: what could a communication protocol for remote (typically Windows) applications possibly have in common with a graphics system normally used on Unix?
In reality, there are a few points in common: both allow an application to be displayed on a screen attached to a machine separate (if networked) from the one the application runs on, with the corresponding handling and redirection of events from the remote workstation, and both use protocols much more compact and efficient than the plain bitmaps (however compressed and optimized) that VNC is based on. The similarities end there, though: Presentation Server has a whole series of advanced features, from the support for multiple communication channels to the various protocol optimizations, that have no counterpart in X Windows. For simple needs, in a predominantly Unix environment it's certainly more effective to use X Windows than VNC, given that the two have practically the same characteristics, while in a mixed environment VNC has the big advantage of a wider range of supported operating systems, if one can of course overlook its other limitations.
For completeness, I'll also mention that there is a version of Presentation Server for Unix: compared to X Windows alone (the graphics system it is obviously based on), it can be useful in those cases where more than just screen, mouse and keyboard need to be shared, or where one needs the optimizations for slow networks that are among ICA's most appreciated features.

Previous posts:
Citrix on the Mac: glossary
Citrix on the Mac: a bit of history
Citrix on the Mac: ICA vs. VNC (and Apple Remote Desktop)