
landrvr1

Members
  • Posts: 369
  • Joined

1 Follower

Personal Information

  • Country
    United States

landrvr1's Achievements

Newbie (1/14)

Reputation: 10

  1. Hmm, sorry to hear that. We've used them on countless client engagements and no one has complained. It's a trade-off, I guess. I'm 99% after great optics, with comfort a secondary concern - only because the viewing time with these is really very short. People are spending 10 minutes or less - usually a lot less - exploring the spaces we create. Optics on most of the others we've tried are pure garbage, usually with way too much curvature in the biconvex lenses.
  2. We've standardized on PowisVR. They are the best quality of the dozens we've tested, and full branding is no problem. They are the only cardboard viewers I'm aware of that offer depth-adjustable, threaded lenses as well as adjustable IPD. https://powisvr.com/
  3. So I really haven't seen much info on this, but thought I'd share some testing results on these two platforms. The Vive wins hands down for one very important reason - which is the last item on the list, heh.
     Optics
     In my estimation the Rift has better optics. The Vive has some issues, and this is very apparent in one's peripheral vision. Keeping your head still, glancing to the left and right reveals slight blurring/warping. The Rift's peripheral vision is near perfect. We had many people of all ages and vision types test it out, and every single one noticed the problem. This isn't to say that it's horrible - it's just there...
     Hand Controls
     The Rift Touch hand controls are better in my view. They're more lightweight, and the form and function allow you to actually do hand and finger gestures - impossible with the Vive. Very nicely done.
     Tracking, Sensors, and Users
     The Vive is significantly better and more accurate. The major problem with the Rift is that the sensors and the entire setup process aren't great. This translates into a real deal-breaker for us: you will never get an accurate height reading, which means that any sort of presentation that involves furniture or an emphasis on accurate human interaction/scale is horrible with the Rift. No matter what I tried, I was always floating at least 2" above the floor. It's easy to test: just place the hand controllers on the real floor and see them floating above the virtual floor. In the Rift setup process, there are two red flags that spell trouble: 1. You are asked for your height. 2. You never have to place the hand controllers on the floor like you do with the Vive setup. Asking for the user's height means that the entire setup is really geared for an INDIVIDUAL user and NOT a group experience. By never placing the hand controllers on the floor, the system never has an accurate reading on exactly where the floor is in Z space, lol. Horrible. That means that if you have a 6ft tall user, the setup has to be for that person. If someone 5'6" tall then tries out the headset, it's way off. This is only compounded by the floor-height issue. The Vive was flawless every time, and totally designed for a different user at any time. The sensors are better on the Vive, and thus tracking is better.
  4. I disagree on a 'big difference' in performance with the GearVR. It has somewhat more accurate tracking with the IMU, but it's barely noticeable on fixed-point pano experiences, if at all. With gaming apps I would agree that the built-in IMU helps a lot. I also wouldn't agree on the app vs web experience. Web-based platforms like KRPano load everything you need on a per-scene basis. Once loaded, the FPS, tracking, etc. aren't any better or worse - it's pretty much the same. You are far more at the mercy of your device's processor than whether it's web or app based. The issue with web-based is that if you have a dodgy internet connection, sometimes sound files and/or hotspots might not load properly. This rarely happens during presentations because we do a quick run-through (a small pre-presentation asset-check sketch is included after the last post below). When the clients take the viewers with them, well, that's out of our control, lol.
  5. KRPano is my platform of choice. Once you get past the initial (slight) pain of building and tweaking the XML, it's super easy because you've got a template you can always start out with (a small template-generation sketch is included after the last post below). It's the only platform that allows sound (ambient and point-of-interest, both in spatial 3D!), and the customization in terms of menu and hotspot creation is fantastic. No app-based platform such as IrisVR or Yulio is going to give you as enriching an experience. At least not yet. It all depends on what you are after: a quick look-see client review or as immersive an experience as possible.
  6. I'm not explaining myself properly. The most common workflow for realtime archviz VR is going to be:
     1. Create your environment in Max/V-Ray like we've always done for any architectural project. This includes all the wonderful shaders, lighting, and techniques we've come to know and love in order to deliver photorealistic renderings and animated films.
     2. Take those building blocks and convert them for use in Unity/Unreal/Stingray/etc. (a rough sketch of the FBX hand-off step is included after the last post below).
     That is far from an easy workflow and is, quite frankly (especially for Unity and Unreal), an enormous amount of work with endless trial-and-error experimentation. No one has that kind of time in the real world of archviz deadlines. From what I've seen, Autodesk is the only entity that's made even the smallest of attempts at making the workflow I described somewhat easier. They may position Stingray as a game engine for creating games, but we know that Autodesk really has its eye on realtime VR for the archviz world. If you are creating nothing but realtime VR experiences for archviz - and never have to bother with V-Ray renderings/films - it MIGHT be possible to get a fast workflow going with Unity. That's not how the business is being handled in the architecture and design firms I've been in contact with... Realtime VR isn't replacing photoreal deliverables; it's being used in conjunction with them. Hope that helps!
  7. Great thread! Some comments:
     Google Cardboard
     I've tested a dozen or more and all of them are complete garbage except for one: the PowisVR. Most cardboard units (a) have way too much curvature in the biconvex lenses and (b) place those lenses way too close to the device. One or both is a recipe for a crap experience. The PowisVR has an appropriate curvature, and each lens is threaded to allow for adjustment. It's the only cardboard-style unit that offers that feature.
     GearVR
     Pretty overrated. That's not to say it's junk - far from it. However, there is absolutely zero difference in the overall optic quality vs the PowisVR unit. They aren't using any better glass technology. Does the experience offer more immersion because of fewer light leaks? Yes, but not dramatically so... With GearVR and cardboard, you are still at the mercy of the resolution of your phone. In fact, if you are using a browser-based experience, the browser will further reduce the resolution somewhat. There's no way that the price tag for GearVR - not to mention the fact that you have to use Samsung phones - justifies our use of the product in widespread client engagements for fixed-point rendering panos. Also, and a lot of folks don't understand this, you are never going to do realtime VR with GearVR. It's fixed-point rendering VR only. The phones have a long way to go before they could handle any serious realtime work.
     Oculus Rift
     Great unit. The Vive is better.
     HTC Vive
     Not much to say there other than it's the way to go. There's a zillion reasons why, but I'll point out that countless hardware manufacturers and architecture firms have adopted the Vive platform over Oculus. Oculus = consumer market. Vive = seriousness. heh.
     Hololens
     Not ready for primetime by a long shot. The limited field of view is pretty horrible. While you can view any FBX file, only Unity is supported to create anything interactive. It's the future, for sure, but right now the promise outweighs the delivery. Maybe the next version...
     Customer/Client Experience
     There's absolutely nothing like giving branded PowisVR viewers away to the client after the presentation. The 'take it with you' or 'leave behind' approach is priceless. They love it. Their co-workers back at the office love it. Their kids and friends love it when they take it home. We've learned very, very quickly that the overall experience and... I'll say it... 'fun factor' cannot be overstated. We tell some great stories with our virtual tours using KRPano, but equally as important to those stories is the fun the clients have. Can you put a price tag on that? No you cannot. hah. I'll say this as well, because it's worth mentioning: of the dozens of clients we've presented virtual tours to, easily fewer than 1% had ever experienced VR or AR. This means that we are their first introduction, and that's the pretty awesome thing - especially when it goes well. Five years from now everyone and their grandmother will have VR and AR but, for now, it's still magic to pretty much everyone.
     Multiformat Approach
     I tell folks a lot that it's important to keep in mind that the Rift and Vive aren't just for photorealistic realtime experiences. Far, far from it. There's half a dozen ways to get a realtime experience going in both Revit and SketchUp. We need to set aside our innate need for photorealism and recognize that realtime VR as a presentation tool tells great design stories that don't need realism. Spatial relationships, content, density of space, and so on are all fantastic to present with realtime VR. We will go from a beautifully rendered fixed-point VR experience with KRPano and cardboard viewers to a SketchUp realtime VR tour during the exact same presentation. It doesn't have to be one or the other.
     Game Engine Madness
     We're adopting Stingray because the workflow from Max/V-Ray to Unity or Unreal is complete garbage for anyone who deals with serious real-world deadlines. We don't have months to delicately pamper FBX files and Unity materials in order to show off yet another condo loft VR example. We have a matter of weeks to build full office floor experiences. We have a ways to go until we are comfortable enough to put these interactive VR presentations in front of a client, but it's very, very clear to us that Autodesk is serious about supporting the architecture and interiors world. Unity couldn't care less about how to get something from Max/V-Ray to their platform. Autodesk's integration and live link between Max and Stingray is huge. Purists from the Unity and Unreal camps hate Stingray. Honestly, it doesn't look as good. Yet. While Unity and Unreal continue to ignore the crushing deadline demands of the realtime archviz world, Autodesk is right out front providing tutorial after tutorial with great workflow ideas. As I mentioned yesterday in a thread over at Chaos, at the end of the day - given our project schedule demands - I'm going to gravitate towards whatever company and platform wants to make my life easier.
  8. Don't think embedding is possible. Regarding your original post: looks cool. But listen, as cool as I'm sure it is, quite frankly the problem isn't creating interactivity... the problem is the nightmare workflow from Max/V-Ray to Unity.
  9. We are not cool anymore. It's all about Revit, baby!!!!! heh
  10. You're absolutely positive that the material slot in the editor is the same material? And that there's nothing in the reflection slots that you forgot about? Have you applied a different material to those objects (like a flat white) and then rendered? What does it look like then? After that, reapply your preferred material. Is there still a problem? (A small override-material sketch is included after the last post below.)
  11. That's great. So... uh... care to elaborate a bit on your workflow from V-Ray to UE4?
  12. It's actually a bug that Chaos is aware of... I started a thread weeks and weeks ago on this topic, and Vlado confirmed the issue but set the priority as 'low' for fixing, lol.
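
A small pre-flight helper related to post 4 above: the "quick run-through" before a presentation can be partly automated by checking that every asset referenced in the tour XML actually exists on disk, so missing sound files or hotspot images get caught before a client walkthrough. This is my own sketch, not part of the original posts; the tour.xml filename and the url-attribute convention are assumptions based on a typical krpano project layout.

```python
# Hypothetical pre-presentation check: verify that every local asset referenced
# by a url="..." attribute in tour.xml resolves to a real file.
from pathlib import Path
from xml.etree import ElementTree


def missing_assets(tour_xml: str = "tour.xml") -> list[str]:
    """Return every locally referenced url attribute that does not resolve to a file."""
    root_dir = Path(tour_xml).resolve().parent
    tree = ElementTree.parse(tour_xml)
    missing = []
    for element in tree.iter():
        url = element.get("url")
        if not url or url.startswith(("http://", "https://", "%")):
            continue  # skip remote URLs and krpano placeholder paths like %SWFPATH%
        if not (root_dir / url).exists():
            missing.append(f"<{element.tag} name={element.get('name')!r}> -> {url}")
    return missing


if __name__ == "__main__":
    problems = missing_assets()
    print("\n".join(problems) if problems else "All referenced assets found.")
```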
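On the KRPano template idea in post 5: a sketch of scripting the per-scene XML instead of hand-editing it each time. This is my own illustration; the tag and attribute names (scene, image/sphere, hotspot, loadscene) follow common krpano examples, but verify them against your own working template and the krpano documentation before relying on this.

```python
# Hypothetical generator for a chained set of krpano scenes, each with one
# "go to next scene" hotspot. Adjust the template to match your real tour XML.
SCENE_TEMPLATE = """\
<scene name="scene_{name}" title="{title}">
    <view hlookat="0" vlookat="0" fov="95" />
    <image>
        <sphere url="panos/{name}.jpg" />
    </image>
    <hotspot name="to_{next_name}" ath="{hotspot_ath}" atv="0"
             onclick="loadscene(scene_{next_name}, null, MERGE, BLEND(1));" />
</scene>
"""


def build_tour(scenes: list[dict]) -> str:
    """Chain the given scenes together, each linking to the next via a hotspot."""
    blocks = []
    for i, scene in enumerate(scenes):
        nxt = scenes[(i + 1) % len(scenes)]
        blocks.append(SCENE_TEMPLATE.format(
            name=scene["name"], title=scene["title"],
            next_name=nxt["name"], hotspot_ath=scene.get("hotspot_ath", 0)))
    return '<krpano version="1.19">\n' + "\n".join(blocks) + "</krpano>\n"


if __name__ == "__main__":
    print(build_tour([
        {"name": "lobby", "title": "Lobby"},
        {"name": "office", "title": "Open Office", "hotspot_ath": 90},
    ]))
```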
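On the Max-to-game-engine hand-off described in post 6: the first mechanical step is usually an FBX export out of 3ds Max, which can be scripted. This is a rough sketch, assuming 3ds Max's built-in Python API (pymxs) is available - it only runs inside Max - and the folder and naming scheme are hypothetical. It does nothing about the material and lighting conversion that makes the workflow painful; it only automates producing the hand-off file.

```python
# Hypothetical helper: export the current viewport selection to an FBX file
# without dialogs, as the first step of a Unity/Unreal/Stingray hand-off.
import os

from pymxs import runtime as rt


def export_selection_to_fbx(out_dir: str, name: str) -> str:
    """Export the currently selected nodes to <out_dir>/<name>.fbx silently."""
    os.makedirs(out_dir, exist_ok=True)
    fbx_path = os.path.join(out_dir, f"{name}.fbx")
    # exportFile with #noPrompt and selectedOnly:true is the scripted equivalent
    # of File > Export Selected using the last-used FBX settings.
    rt.exportFile(fbx_path, rt.Name("noPrompt"), selectedOnly=True)
    return fbx_path


# Example (inside Max, with the geometry selected):
# export_selection_to_fbx(r"D:\projects\office_floor\engine_handoff", "level_03_shell")
```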
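And for the flat-white test suggested in post 10: a sketch of temporarily overriding the selection's materials, rendering, and restoring the originals, again assuming 3ds Max's pymxs API. It uses a plain standard material so it does not depend on V-Ray being loaded; class and property names follow the MaxScript reference, but treat this as a starting point rather than production code.

```python
# Hypothetical diagnostic: swap a flat white material onto the selection,
# render, then put the original materials back.
from pymxs import runtime as rt


def render_with_flat_white():
    """Temporarily override the selection's materials with flat white, render, restore."""
    white = rt.StandardMaterial(name="debug_flat_white")
    white.diffuse = rt.Color(255, 255, 255)

    selection = list(rt.getCurrentSelection())
    originals = [(node, node.material) for node in selection]  # remember what to put back
    try:
        for node in selection:
            node.material = white
        rt.render()  # render with the override applied; compare against the original result
    finally:
        for node, original in originals:
            node.material = original  # restore the original materials


# render_with_flat_white()
```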