[vlc-devel] OpenHMD branch?
lomax at clickworkorange.com
Mon May 20 00:57:42 CEST 2019
I've just tried cleanly rebuilding everything once more, this time using
a manual build of vlc/contrib/libplacebo. The shipped version (although
it has the same version number) does not seem to include shaderc support, so I
downloaded/built/installed that first, then libplacebo, then VLC. I'm
basically trying to retrace every step I took on the previous
Celeron/Intel UHD machine, where it did work (albeit at slideshow
speeds). Alas, this seems to have made no difference. I'm almost
beginning to suspect that the change of GPU is the root of my problem...
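For anyone retracing these steps, this is roughly how I built libplacebo
with shaderc enabled before rebuilding VLC (a sketch from memory; the
option name and paths may differ on your system, so check
`meson configure` for your version):

```shell
# Hedged sketch: build libplacebo with shaderc support, then install it
# so the VLC build picks it up. Assumes shaderc/libshaderc is already
# installed system-wide.
git clone https://code.videolan.org/videolan/libplacebo.git
cd libplacebo
meson setup build -Dshaderc=enabled
ninja -C build
sudo ninja -C build install
```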
Some concrete questions:
Does VLC-HMD require libplacebo built with shaderc support?
What is the role of libplacebo?
What is the role of assimp?
Are the VirtuaTheatre files needed for stereoscopic 360 playback?
Are the following runtime errors possibly related to my problem?
- chain filter error: Too high level of recursion (3)
- main filter error: Failed to create video converter
- main vout display error: Failed to create video converter
- main vout display error: Failed to adapt decoder format to display
- main video output error: video output creation failed
- main decoder error: failed to create video output
I suspect not, because I do remember seeing them before - and there *is*
video output, just not in the right format.
Is there any debugging method I can use to find out more - e.g. should I
build VLC with some debugging option?
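In case it helps, the most verbose output I know how to get uses stock
VLC options (nothing branch-specific; the file name here is just an
example):

```shell
# Hedged sketch: run VLC at debug verbosity and capture the log to a file.
# -vvv enables debug messages; --file-logging/--logfile write them out.
vlc -vvv --file-logging --logfile=vlc-debug.log video.mp4
```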
And finally, the main question: What might be the reason that the video
displays as a rectangular mono panorama instead of the expected side by
side stereo "squircles"?
To be fair, I am putting these questions as much to myself as to the
list, hoping that those who know the ins & outs of VLC can provide some
further insight.
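Partly answering my own question about the per-eye shape: as I
understand it, each eye's image gets a radial pre-distortion to
compensate for the lenses, which is what rounds a rectangle into the
circular "squircle". A minimal sketch of the idea (my own illustration;
the coefficients are invented, not OpenHMD's, and this is not the
branch's actual code):

```python
# Minimal sketch of per-eye radial ("barrel") pre-distortion, the kind of
# transform a HMD renderer applies so the image looks straight through the
# lenses. The polynomial coefficients below are invented for illustration;
# a real renderer would take them from the device (e.g. via OpenHMD).
def distort(x, y, k=(1.0, 0.22, 0.24)):
    """Scale a point (relative to the lens centre) by a polynomial in r^2."""
    r2 = x * x + y * y
    scale = k[0] + k[1] * r2 + k[2] * r2 * r2
    return x * scale, y * scale

# The lens centre stays in place, while points further out are pushed
# outwards - applied per eye, this produces the circular per-eye look.
print(distort(0.0, 0.0))
print(distort(0.5, 0.0))
```

Without a headset attached there is nothing to feed this transform, which
would explain a flat rectangular panorama instead.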
On 2019-05-19 17:26, Lomax wrote:
> Hi Alexandre,
> Thank you for spending time on this - don't worry about my long
> rambling post from Friday; if there's any useful information there then
> great, but the questions are not so important. In answer to your
> questions re. usage, I'm working on a public exhibition at a "marine
> knowledge centre" in Sweden (a registered non-profit), where the staff
> are also involved in practical science - including diving to document,
> sample and survey the marine life and seabed conditions. They have a
> very nice stereoscopic 360 underwater camera (not sure of the model but
> can check if you want) but they have no means to show its footage to
> the museum visitors - which is where I come in: amongst other things*
> I've been tasked to build a permanent exhibit with which visitors can
> view this material. I chose, perhaps foolishly, to use the Oculus Rift
> CV1, thinking it was a well established product with plenty of support.
> Only afterwards did I find out that it was nothing of the sort, and
> that indeed the whole area of stereoscopic 360 video is severely
> underdeveloped. Only proprietary (Windows/MacOS) applications seem to
> be available, which would be unsuitable for use in an exhibition: no
> user interaction should be required to launch the video, the computer
> and any OS or application GUI elements should stay invisible, and the
> system needs to be reliable - we just want the video to play in a
> continuous loop from when the computer is switched on until it's
> switched off, and nothing else. This, and the fact that it is open
> source, is why I went with VLC - thinking the command line interface
> should make it easy to get it to start playing on boot (and it was),
> and that we could tweak anything else we might want to adjust in the
> source if needed. So our application is basically as simple as could
> be; there's no interactivity, no audio, and no motion tracking beyond
> rotation, and the material is pretty "standard" (at least as far as any
> standards are established). But I've now spent 50hrs+ on this, and
> EUR1500+ on hardware, yet have nothing to show, which means I'll have
> some tough explaining to do...
> I've taken some quick snaps of what I see[1-4] - please let me know if
> there's any specific information I can provide which may help getting
> to the bottom(!) of this.
> *) For example, I've also built a remote controlled submerged camera
> out in the water, based on an Axis PTZ PoE camera, which can be
> controlled with an analogue joystick from inside a mock "submarine".
> This is 2D HD and works beautifully on a Pi3 connected to a curved 27"
> screen.
> 1: https://i.imgur.com/WGzfcY4.jpg
> 2: https://i.imgur.com/bAy39Wy.jpg
> 3: https://i.imgur.com/jWTbBxn.jpg
> 4: https://i.imgur.com/SswgW8T.jpg
>> Hi, I'll reply to your bigger previous post later, but first, thank
>> you for your comprehensive testing of this branch and the report you
>> made.
>> The code extracts lens properties from OpenHMD to match the
>> projection to the way the headset works. If there is no HMD, there
>> is no projection. The "circle-like" effect is a colour and pixel
>> transformation (using panotools parameters) to match the lenses used
>> by the headset; it is needed because of the short distance between
>> the eyes and the screen in the headset.
>> What is your setup, and do you have a screenshot of the issue? This
>> code was tailored for headset demos, so there are probably a lot of
>> different hacks, and mostly only headset playback is likely handled
>> correctly in this version.
>> I can try to provide you a version with some fixes for the demo if
>> you want and if it's not too time-consuming, but I would like to know
>> the exact use case for your demo. You can ping me as "unidan" on IRC
>> if you need some help with the build itself.
>> Alexandre Janniaux,