Raspbian mouseless
6/6/2023

When you use office apps, there's little attempt to maintain physical-world properties like, say, object permanence. Delete a word, and it "goes someplace, because it can't just vanish"? That would be silly. There's no need to emulate physical paper, or pens, or rubber erasers, or cut-and-paste with an X-Acto knife, paper fragments, and gooey paste from a jar. It can sometimes be useful or artistically fun to do so, but that's a UX choice. Nor, with non-novice users, need any other physical correlation be maintained if it's useful to ignore it. And similarly, there's no reason for your hands to be positioned at the end of your arms (as with "throwy hands"). When an artist draws with a display-less graphics tablet, the hand is in one place, but they watch and use an abstraction of it on a display, displaced in space, orientation, and scale. Touchpads and cursors have similar non-trivial relationships.

> When there's a good solution to this I suspect quite a few developers will give it a go when they want a fully immersive in-the-zone experience.

And shallow 3D seems to have interesting possibilities for IDEs.

> the critical show-stopper is not being able to use your hands to type onto a real keyboard

I've been using my unremarkable laptop keyboard since the Vive, by doing pass-through AR with a camera gaff-taped to the HMD. With a browser-as-compositor stack, it's two-video-DOM-elements trivial. Though I've heard that with a SteamVR stack, pass-through latency can be a challenge.

Hand tracking is a poster child for acquisitions wiping out market infrastructure. The sole hand-tracking survivor was Leap Motion, which fumbled its couple of Apple acquisition attempts. It's been Windows-only, stalled, kind of crufty, and has now finally been acquired, with uncertain consequences for future availability.

Nreal's 1080p AR glasses could be available in a couple of months. If you disable the "oh, it's an HMD, not a normal display" feature the OP speaks of, you can easily throw up a pixel-aligned test pattern image. Readable 5 pt fonts, almost corner to corner. With shutter glasses, because DJI didn't expose the two panel feeds. And reading glasses, because DJI's lenses. :/

> The display quality of the new Index (for instance) is sufficient.

Is it? Are the lenses that much better? My Lenovo Explorer WMR is 1440^2 px/eye, and thus has a similar angular resolution to the wider-FoV Index. But I find each eye gets a central circular region something vaguely like 300 px in diameter where subpixel rendering is worthwhile (I have a custom render stack), and less than 600 px before the lens blurs pixels together. And it gets even worse as objects move in from infinity, and thus out of these regions.

Using a virtualized Raspberry Pi environment lets you try out the operating system with little effort. All the messing around involved in writing a disk image to SD is avoided. Further, virtualization gives anyone wanting to dip a toe in the pie (!) a quick chance to do so. A virtual Raspberry Pi offers the chance to gauge how the various apps will run, which might be useful to children using Scratch or other development tools. Making screenshots on the Raspberry Pi is simple enough, but exporting them can be tricky; virtualization circumvents that. It's also good practice to test a new operating system in a virtualized environment. Virtualization is just another way of looking at things, and Raspberry Pi fans love to play and tinker. A virtual Pi may not feature a physical computer, but it can be a time saver, and a bit of a game changer in some scenarios.
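One common way to get such a virtual Raspberry Pi is QEMU's Raspberry Pi board models. The sketch below only assembles the command line rather than launching anything; `raspi3b` is a real QEMU machine type, but every file name here is a placeholder you must replace with the kernel, device tree blob, and OS image from your own Raspberry Pi OS download.

```python
# Assemble (but do not launch) a QEMU command line for a virtual Raspberry Pi 3.
# All file names are placeholders; copy the real kernel and .dtb out of the
# boot partition of your Raspberry Pi OS image before booting.
def build_qemu_cmd(kernel: str, dtb: str, sd_image: str) -> list[str]:
    return [
        "qemu-system-aarch64",
        "-M", "raspi3b",               # emulated board model
        "-kernel", kernel,             # e.g. kernel8.img from the boot partition
        "-dtb", dtb,                   # device tree blob matching the board
        "-drive", f"file={sd_image},format=raw,if=sd",
        "-append", "root=/dev/mmcblk0p2 rootwait console=ttyAMA0",
        "-serial", "stdio",            # kernel console on your terminal
    ]

cmd = build_qemu_cmd("kernel8.img", "bcm2710-rpi-3-b.dtb", "raspios.img")
print(" ".join(cmd))
```

Actually running the printed command needs a reasonably recent QEMU (the `raspi3b` model name appeared around version 5.2; older releases called it `raspi3`), and QEMU's SD-card emulation expects the image file to be resized to a power-of-two size first.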
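The angular-resolution comparison in the display discussion (Explorer WMR vs. the wider-FoV Index) comes down to pixels per degree. A minimal back-of-the-envelope sketch: the per-eye horizontal pixel counts match the text, but the field-of-view figures (roughly 100° for the Explorer, 130° for the Index) are my assumed round numbers, not measurements.

```python
# Rough pixels-per-degree estimate: horizontal pixels spread over horizontal FoV.
# This ignores lens distortion, which concentrates pixels near the centre.
def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    return h_pixels / h_fov_deg

explorer = pixels_per_degree(1440, 100)  # Lenovo Explorer WMR, ~100 deg assumed
index = pixels_per_degree(1440, 130)     # Valve Index (1440x1600/eye), ~130 deg assumed
print(f"Explorer: {explorer:.1f} ppd, Index: {index:.1f} ppd")
```

The Index spends its extra pixels on a wider field of view, so the two headsets land in the same ballpark, which is exactly the point being made; real optics vary pixels-per-degree across the lens, so treat these as coarse averages.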
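The per-eye sharpness regions mentioned in the display discussion (subpixel rendering worthwhile only within a roughly 300 px diameter central disc, with lens blur dominating beyond roughly 600 px) can be expressed as a toy level-of-detail check. The thresholds are the ones quoted; the function itself is my sketch, not the commenter's custom render stack.

```python
import math

# Toy per-pixel quality classifier based on distance from the lens centre.
# Thresholds follow the quoted observation: subpixel rendering pays off inside
# a ~300 px diameter disc, and past ~600 px the lens blurs pixels together.
def render_quality(px: float, py: float, cx: float, cy: float) -> str:
    r = math.hypot(px - cx, py - cy)
    if r <= 150:         # within the ~300 px diameter central disc
        return "subpixel"
    if r <= 300:         # still sharp enough for plain per-pixel detail
        return "pixel"
    return "blurred"     # lens blur dominates; cheap rendering suffices

print(render_quality(100, 0, 0, 0))  # prints "subpixel"
```

A renderer could use a check like this to spend subpixel effort only where the optics can actually resolve it, which also motivates the note that things get worse as objects move in from infinity and out of these regions.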