Categories
Uncategorised

Discussion Hub: 360° vs 180°

VR 360 VS VR 180

  1. Reduced Resolution: VR360 spreads the same pixel budget across a much larger field of view (FOV), effectively halving the per-degree resolution compared to VR180. This exacerbates the already low resolution of VR content, making it harder for viewers to tolerate.
  2. No Place to Hide Equipment: Unlike VR180, VR360 makes it difficult to conceal crew, equipment, or staging elements. This often leads to unnatural setups or compromised visuals.
  3. Uninteresting Rear Views: Much of the 360° FOV is often wasted, as there’s usually little or no action behind the camera. This leads to sections of the video being filler, detracting from overall engagement.
  4. Navigation Fatigue: Users often need to constantly scan around to find the interesting parts of the content in VR360. This can quickly become frustrating and detract from the viewing experience, especially given the limited FOV of most VR headsets (~90°).
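Point 1 can be made concrete with a quick calculation. Assuming a hypothetical 8192-pixel-wide equirectangular frame (the width is an illustrative assumption), the average horizontal pixel density works out as follows:

```python
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Average horizontal pixel density over the covered field of view."""
    return horizontal_pixels / horizontal_fov_deg

# Hypothetical 8192-pixel-wide equirectangular frame:
vr360 = pixels_per_degree(8192, 360)  # spread over the full sphere
vr180 = pixels_per_degree(8192, 180)  # concentrated in the front hemisphere
print(f"VR360: {vr360:.1f} px/deg, VR180: {vr180:.1f} px/deg")
# VR360: 22.8 px/deg, VR180: 45.5 px/deg
```

With the same frame width, VR180 delivers exactly twice the horizontal density, which is the "halved resolution" trade-off described above.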

Why Not Just Shoot 2D?

  1. Lack of Depth and Realism in 2D:
  • Breaks Immersion: VR thrives on immersion, and 2D fails to deliver the depth required for a convincing VR experience. Without stereoscopic 3D, objects appear flat and the sense of scale is lost, which diminishes the viewer’s emotional and physical engagement.
  • Unnatural Scale: In 2D VR, the size and spatial relationships of objects feel “off.” The brain expects depth cues in a VR environment, and their absence disrupts the illusion, making the experience feel artificial.
  2. No True First-Person Perspective: In VR, users often seek experiences that mimic real-life perspectives. 2D doesn’t replicate how humans naturally perceive the world, making the experience feel more like watching a flat screen than being present in the moment.
  3. Stronger Expressive Power: Stereoscopic depth conveys scale, presence, and emotion more forcefully than a flat image can.

Comparison of the user experience of two different 180 VR film formats:

Version1

In this format, the viewing distance is fixed. When zooming in or out near the edges of the image, significant distortion occurs, which can lead to visual discomfort. The edges remain visible at all times, and the 3D effect is relatively weak, resulting in a flatter image with reduced comfort and immersion.

Version2

From a distance, this format presents a protruding, roughly 45° “block-like” spatial effect. However, when viewers actively zoom in or move closer, they can effectively pass through a “portal” into the 3D space. This transition makes the boundary far less noticeable, prevents scene distortion, and offers a perceptible sense of entering a new spatial environment. As a result, Version2 provides a significantly stronger feeling of immersion and is less likely to cause dizziness or motion sickness.

Compared to Version1, Version2 delivers a more immersive and comfortable experience, highlighting the importance of flexible viewing distances and portal-like transitions in immersive 180 VR film design.


VR 180 Test

Test Shots (VR 180#1)

This is the first test using Unreal Engine to create a 180-degree video. (Resolution: 8192 × 4096)

I used a custom camera rig to render the image sequences in PNG format at 24 fps and then compiled the video in DaVinci Resolve at 24 fps.
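For reference, a PNG image sequence like this can also be compiled outside Resolve. This is a hedged sketch of an equivalent FFmpeg invocation (the frame naming pattern `frame_%04d.png` is an assumption, not the rig's actual output name):

```python
import shlex

def png_sequence_to_mp4_cmd(pattern: str, fps: int, out: str) -> str:
    """Build an ffmpeg command that compiles a PNG image sequence into H.264 video."""
    args = [
        "ffmpeg",
        "-framerate", str(fps),   # interpret the input sequence at this rate
        "-i", pattern,            # e.g. frame_%04d.png (hypothetical naming)
        "-c:v", "libx264",
        "-pix_fmt", "yuv420p",    # broad player compatibility
        out,
    ]
    return shlex.join(args)

cmd = png_sequence_to_mp4_cmd("frame_%04d.png", 24, "vr180_test.mp4")
print(cmd)
```

Matching the `-framerate` value to the render rate (24 fps here) avoids the frame-rate mismatch issues discussed later in the 360° tests.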

This was my first attempt at using Unreal Engine to render a 180° video and upload it to YouTube. The process was successful—the video is now viewable as a proper 180° experience both inside a VR headset and on YouTube’s web platform.

However, my main focus for this test was simply to complete the rendering and ensure the footage played correctly on YouTube. Because of that, I didn’t pay much attention to the stitching quality. As a result, the stereo effect wasn’t as strong as I had hoped, and after watching for a few minutes, I started feeling slightly dizzy. I suspect the issue stems from a misalignment between the left- and right-eye images, which likely disrupted depth perception.

Test Shots (VR 180 #2)

This is the second test using Unreal Engine to create a 180-degree video at a different resolution from the previous one (Resolution: 6480 × 2160).

I used a custom camera rig to render the image sequences in PNG format at 24 fps and then compiled the video in DaVinci Resolve at 24 fps.

In this experiment, the only change I made was adjusting the resolution from 8192 × 4096 to 6480 × 2160. This new resolution provides a more rectangular field of view, making the experience more comfortable for the audience.

For the next iteration, I plan to address the misalignment issue to improve the stereo effect and reduce discomfort.

Test Shots (VR 180 #3)

This is the third test using Unreal Engine to create a 180-degree video (Resolution: 6480 × 2160).

I used a custom camera rig to render the image sequences in PNG format at 24 fps and then compiled the video in DaVinci Resolve at 24 fps.

In this experiment, I attempted to manually stitch the images for the right and left eyes.

However, the stereo effect still wasn’t as strong as I had hoped. I suspect this might be due to overlooking both vertical and horizontal disparity, which could be affecting the depth perception.
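The weak depth described above can be reasoned about with the standard pinhole stereo model, where on-screen disparity falls off with distance. A minimal sketch (the focal length and interpupillary distance values below are illustrative assumptions, not measurements from my rig):

```python
def disparity_px(focal_px: float, baseline_m: float, depth_m: float) -> float:
    """Horizontal disparity in pixels for a point at the given depth
    (pinhole stereo model: disparity = f * B / Z)."""
    return focal_px * baseline_m / depth_m

# Illustrative values: ~1000 px focal length, 64 mm interpupillary distance.
near = disparity_px(1000, 0.064, 1.0)    # object 1 m away -> 64.0 px of disparity
far = disparity_px(1000, 0.064, 10.0)    # object 10 m away -> 6.4 px
```

Horizontal disparity is what produces the depth percept; vertical disparity between the eye images should ideally be zero, since any residual vertical offset tends to cause eye strain rather than depth.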

Test Shots (VR 180#4)

In this test, I inserted a title with a black background and an image into a 180° VR video while maintaining a stereo effect. The video was then encoded and exported in a side-by-side format.

Test Shots (VR 180#5)


In this test, I inserted a title with a black background and an image into a 180° VR video while maintaining a stereo effect. The video was then encoded and exported in a top-and-bottom format.

However, based on my observations, YouTube seems to provide the best support for the side-by-side format.
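The two packing layouts tried in tests #4 and #5 differ only in how the per-eye frames are arranged inside one container frame. A small sketch of the resulting dimensions (the 3240 × 2160 per-eye size is an assumption chosen to reproduce the 6480 × 2160 test resolution):

```python
def packed_size(eye_w: int, eye_h: int, layout: str) -> tuple[int, int]:
    """Container resolution for a stereo pair packed into one frame."""
    if layout == "side-by-side":
        return (2 * eye_w, eye_h)    # eyes share one row
    if layout == "top-and-bottom":
        return (eye_w, 2 * eye_h)    # eyes stacked vertically
    raise ValueError(f"unknown layout: {layout}")

# With 3240 x 2160 per eye, side-by-side yields the 6480 x 2160 frame used in test #2:
sbs = packed_size(3240, 2160, "side-by-side")    # (6480, 2160)
tab = packed_size(3240, 2160, "top-and-bottom")  # (3240, 4320)
```

Either layout carries the same per-eye information; the choice mainly matters for player support, which is why side-by-side worked best on YouTube here.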


VR 360 Test

https://miro.com/app/board/uXjVLsJis5o=/?moveToWidget=3458764613723908721&cot=14

Test Shots (VR 360 #1)

This is the first test using Unreal Engine to create a 360-degree video.

I used Unreal’s experimental Panoramic Capture Tool to render the image sequences in PNG format at 24 fps and then compiled the video in DaVinci Resolve at 90 fps.

This journey has been an exciting adventure, marking my first experiments with Unreal Engine to create 360-degree videos. During these trials, I successfully rendered 360-degree images.

However, the default stitching method in Unreal introduced some artifacts, and the entire video appeared slightly jittery when viewed in a headset. As a result, I decided to continue refining the process.

Potential Cause:

1. Unreal’s default stitching method introduces the artifacts.

2. The inconsistency in frame rate settings between Unreal Engine’s render configuration and DaVinci Resolve’s compilation settings is causing the video to appear jittery.
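Cause 2 is plausible: 90 / 24 = 3.75 is not an integer, so when a 24 fps sequence is conformed to a 90 fps timeline, each rendered frame must be shown either 3 or 4 times, producing an uneven cadence that reads as jitter. A quick sketch of that mapping (the function below is my own illustration, not how Resolve actually conforms footage):

```python
import math

def source_frame_repeats(render_fps: int, timeline_fps: int, n_source: int) -> list[int]:
    """How many timeline frames each source frame occupies when a render_fps
    sequence is conformed to a timeline_fps timeline by nearest-frame repetition."""
    counts = [0] * n_source
    i = 0
    while True:
        src = math.floor(i * render_fps / timeline_fps)
        if src >= n_source:
            break
        counts[src] += 1
        i += 1
    return counts

print(source_frame_repeats(24, 90, 8))  # [4, 4, 4, 3, 4, 4, 4, 3] -> uneven cadence
```

Every fourth source frame is held for a shorter time than its neighbours, which is exactly the kind of irregular motion that shows up as judder in a headset. Matching the timeline rate to the render rate (as in the later 30 fps tests) removes this.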

Test Shots (VR 360 #2)

This is the second test using Unreal Engine to create a 360-degree video.

I used the Unreal 360 plugin to render the image sequences in PNG format at 24 fps and stitched them together into panoramic images.

Finally, I compiled the video in DaVinci Resolve at 90 fps.

Side By Side Comparison (FFmpeg Stitch)

With Volumetric Fog, Bloom, Light Shafts, and Vignette Turned On (Figure #1)

With Volumetric Fog, Bloom, Light Shafts, and Vignette Turned Off (Figure #2)

As shown in the images above, the FFmpeg stitching method performed exceptionally well with Volumetric Fog, Bloom, Light Shafts, and Vignette turned off. These elements can introduce unpleasant stitching artifacts, as highlighted in Figure 1. (While these artifacts may not be very noticeable in the flat panoramic image, they can appear severe when viewed inside a VR headset.)
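Side-by-side comparisons like the ones above can be produced mechanically. One hedged way (assuming two rendered clips of equal height, with hypothetical file names) is FFmpeg's hstack filter:

```python
import shlex

def hstack_compare_cmd(left: str, right: str, out: str) -> str:
    """Build an ffmpeg command that places two clips side by side for comparison."""
    return shlex.join([
        "ffmpeg",
        "-i", left,                      # e.g. effects_on.mp4 (hypothetical name)
        "-i", right,                     # e.g. effects_off.mp4
        "-filter_complex", "hstack=inputs=2",
        out,
    ])

cmd = hstack_compare_cmd("effects_on.mp4", "effects_off.mp4", "comparison.mp4")
print(cmd)
```

The same approach with `vstack` would stack the clips vertically instead, if the renders share a width rather than a height.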

Therefore, for the remaining test renders, I disabled the Volumetric Fog, Bloom, Light Shafts, and Vignette settings and added them back in the post stage.

Potential Cause:

1. The inconsistency in frame rate settings between Unreal Engine’s render configuration and DaVinci Resolve’s compilation settings is causing the video to appear jittery.

Test Shots (VR 360 #3)

This is the third test using Unreal Engine to create a 360-degree video.

I used the Unreal 360 plugin to render the image sequences in PNG format at 30 fps and stitched them together into panoramic images.

Finally, I compiled the video in DaVinci Resolve at 30 fps.

In this test render, the earlier jittering effect has become less noticeable. However, the default stitching method in Unreal still introduces some unwanted artifacts in the final output video.

As a result, I decided to continue refining the process.

Potential Cause:

1. Unreal’s default stitching method causes the artifacts.

Test Shots (VR 360 #4)

This is the fourth test using Unreal Engine to create a 360-degree video.

I used the Unreal 360 plugin to render the image sequences in PNG format at 30 fps and stitched them together into panoramic images.

Finally, I compiled the video in DaVinci Resolve at 30 fps.

In this test render, the earlier jittering effect has become less noticeable.

By switching from Unreal’s default stitching method to the FFmpeg stitching method, I was able to dramatically reduce the unwanted artifacts in the final output video.

As a result, I have decided to use this method for the final project output.

All Resolved


Timeline Of the Great Fire #1