
  • https://en.wikipedia.org/wiki/M-DISC

    M-DISC’s design is intended to provide archival media longevity.[3][4] M-Disc claims that properly stored M-DISC DVD recordings will last up to 1000 years.[5] The M-DISC DVD looks like a standard disc, except it is almost transparent, with later DVD and BD-R M-Discs having standard and inkjet-printable labels.

    In 2022, the NIST Interagency Report NIST IR 8387[25] listed the M-Disc as an acceptable archival format rated for 100+ years, citing the aforementioned 2009 and 2012 tests by the US Department of Defense and French National Laboratory of Metrology and Testing as sources.

    That being said, that’s 100GB a disc. You can stuff a lot more on a typical hard drive, and I appreciate that people want an easy, inexpensive, and reliable way to store very large amounts of data for the long term.

    EDIT: At least in a quick search on Amazon, while there are plenty of drives rated for M-DISC, I don’t see any kind of “take hundreds of discs, feed them mechanically in and out of a drive” device that’d let one archive very large amounts of data automatically. You’d need 100 discs to fully archive a 10TB hard drive.
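
    For concreteness, the arithmetic (a trivial sketch; 100GB per disc is the capacity mentioned above):

    ```python
    import math

    # Discs needed to archive a drive, at 100 GB per M-DISC.
    DISC_GB = 100
    DRIVE_TB = 10

    discs = math.ceil(DRIVE_TB * 1000 / DISC_GB)
    print(f"{DRIVE_TB} TB drive -> {discs} discs")  # 100 discs
    ```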


  • I haven’t been looking recently, but I assume that most image hosting services have been stripping EXIF metadata, or at least some of it, for years. Imgur, which was used for image hosting for Reddit for a long time, strips it.

    On Lemmy, pict-rs strips EXIF metadata. It’s a real annoyance on !imageai@sh.itjust.works, because the AI image generators I’ve seen attach metadata indicating:

    • That the image was generated via AI.

    • The prompt keywords used to generate the image, if using something like Automatic1111.

    • In the case of ComfyUI, the entire workflow, so that someone can reproduce the workflow that led to the image.

    I’d kind of prefer that there be some software that tries to identify personally-identifiable data, and have pict-rs run that and remove only what it flags. Or, alternately, let the user opt in to not stripping EXIF metadata.
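
    Something like the following minimal Python sketch using Pillow illustrates the idea: it drops just the GPS IFD (where location data lives) from a JPEG’s EXIF and keeps everything else. This is my own sketch, not anything pict-rs actually does, and note that some generators write prompts into PNG text chunks rather than EXIF, which this doesn’t touch.

    ```python
    from PIL import Image

    def strip_gps(src_path: str, dst_path: str) -> None:
        """Remove only GPS location data from a JPEG's EXIF, keeping the rest."""
        img = Image.open(src_path)
        exif = img.getexif()
        # Tag 0x8825 is the pointer to the GPS IFD, where location data lives.
        exif.pop(0x8825, None)
        img.save(dst_path, exif=exif)

    strip_gps("photo.jpg", "photo_no_gps.jpg")  # hypothetical filenames
    ```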




  • LG TVs will soon leverage an AI model built for showing advertisements that more closely align with viewers’ personal beliefs and emotions. The company plans to incorporate a partner company’s AI tech into its TV software in order to interpret psychological factors impacting a viewer, such as personal interests, personality traits, and lifestyle choices. The aim is to show LG webOS users ads that will emotionally impact them.

    “As viewers engage with content, ZenVision’s understanding of a consumer grows deeper, and our… segmentation continually evolves to optimize predictions,” the ZenVision website says.

    Going beyond ads, if you start training AIs on human preference based on mass-harvested emotional data, I imagine that you can optimize output quite considerably. Like, say I have facial recognition being converted to emotional-response data, maybe something like smartwatch pulse data, some other stuff, and I go train an AI to try to produce a given emotional output in a viewer. I bet that it could do a pretty good job of that. Like, figuring out how to piss people off at a target in political campaigns, building an AI that has a potent ability to emotionally manipulate and flirt with humans, or ensuring that interest doesn’t waver in television content by determining at what points people lose interest.



  • There’s not really enough here to give a conclusive answer from just “it’s not reachable”. All I can do is tell you what I’d probably do to troubleshoot further.

    My first steps in troubleshooting connectivity would probably be something like this:

    • Fire up something on the HTTP server (I’m assuming it’s running Linux) like sudo tcpdump port 80. That should let you see any packets that are reaching the HTTP server.

    • From a Linux machine on an outside network — a tethered cell phone might make a reasonable machine, if you don’t have another machine you control out there somewhere in the ether — run something like mtr --tcp -P 80 <hostname>. That’ll tell you, at an IP-hop-by-IP-hop level, whether there’s anything obstructing reaching the machine. Could be that your ISP blocks 80 inbound, for example.

    • The next step is probably to see whether you can get regular ol’ HTTP through. Also from an outside network, run curl --verbose http://<hostname>/. That’ll let you see what’s happening at the HTTP level.

    I’m guessing that you’re probably going to have something along here break. It could be that the packets are being blackholed at a hop prior to reaching your router, in which case your ISP may be firewalling inbound traffic on that port. It may be that they’re reaching your router, but that your router is trying to forward to the wrong machine. It may be that you have some kind of firewall on the HTTP server that’s blocking connections that don’t come from localhost or from the WireGuard side. But at least it’ll probably give you a better idea as to how far things are getting.

    Once you’ve got that up and running, you can look at HTTPS:

    • If HTTP is working and you want to test the TLS certificate handshake and see if there are any issues, again from an outside network: openssl s_client -connect <hostname>:443 -prexit. That’ll let you see the TLS handshake and any issues that happen during it.

    • Also from an outside network, run curl --verbose https://<hostname>/. That’ll let you see what’s happening at the HTTPS level.

    EDIT: Oh, yeah, and as someone else pointed out, confirming that the DNS resolution is what you expect is probably also a good first step: host <hostname> from an outside network.
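
    If it’s handy to run the same triage from a script, here’s a minimal Python sketch of the DNS / TCP / TLS checks above (my own consolidation, not a replacement for tcpdump or mtr; substitute your actual hostname):

    ```python
    import socket
    import ssl

    HOSTNAME = "example.com"  # hypothetical; substitute your server's name

    # Step 0: DNS resolution (equivalent to `host <hostname>`).
    try:
        addr = socket.gethostbyname(HOSTNAME)
        print(f"DNS: {HOSTNAME} -> {addr}")
    except socket.gaierror as e:
        raise SystemExit(f"DNS failed: {e}")

    # Step 1: raw TCP connects on 80 and 443 (roughly what mtr --tcp probes).
    for port in (80, 443):
        try:
            with socket.create_connection((addr, port), timeout=5):
                print(f"TCP {port}: open")
        except OSError as e:
            print(f"TCP {port}: failed ({e})")

    # Step 2: TLS handshake on 443 (equivalent to `openssl s_client`).
    try:
        ctx = ssl.create_default_context()
        with socket.create_connection((addr, 443), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=HOSTNAME) as tls:
                print(f"TLS OK: {tls.version()}, subject {tls.getpeercert()['subject']}")
    except OSError as e:  # ssl.SSLError is a subclass of OSError
        print(f"TLS failed: {e}")
    ```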


  • That being said, from what I’ve heard, some of the newer high-DPI devices handle this a lot better.

    I mean, you can get higher-resolution ones, but they aren’t as high resolution as even the monitors that you’d virtualize. Like:

    • First, the guy is using glasses that are really designed for augmented reality, not as a monitor replacement. They’re not optimizing for this use case.

    • We aren’t yet at the point where traditional displays are really even maxed out in terms of usable resolution, and as things stand, HMDs have lower resolution.

    • If someone wants that “virtual projection” thing, HMDs have to be even higher-resolution than the monitors they’re replacing.

    One good thing about these AR goggles compared to trying to use VR goggles for this is that the AR goggles spend their physical pixels in the center of your visual field, as opposed to way off in the periphery. VR goggles need a really wide field of view to provide immersion and let you see things in the corner of your eye, but for working with text and such, monitor-replacement hardware only needs to cover the smaller visual arc in which you’d actually view a regular monitor, in the center of your field of vision. So repurposing these for a desktop replacement at least uses the physically-displayed pixels more efficiently than VR goggles would; the XReal goggles here are closer to being optimized as a “monitor replacement” HMD than VR goggles would be.
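
    To put rough numbers on that idea (the field-of-view figures below are my assumptions for illustration, not specs from the article):

    ```python
    # Average horizontal pixels per degree of visual arc for a given panel
    # width spread across a given field of view. All FOVs here are assumed.
    def pixels_per_degree(px_wide: int, fov_deg: float) -> float:
        return px_wide / fov_deg

    print(f"AR glasses (1920 px over ~45 deg):  {pixels_per_degree(1920, 45):.0f} px/deg")
    print(f"VR headset (1920 px over ~100 deg): {pixels_per_degree(1920, 100):.0f} px/deg")
    # A 2560-px-wide monitor at a typical desk distance spans very roughly 50 deg.
    print(f"Desktop monitor:                    {pixels_per_degree(2560, 50):.0f} px/deg")
    ```

    Same panel, but the narrower AR field of view concentrates the pixels where a monitor would actually be.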


  • I think that some of the issue here is that the theoretical use case that these are designed around is not what the author is trying to use them for.

    The author is looking for a monitor replacement.

    These are augmented reality goggles. Like, the hardware is optimized for looking at the world around you and displaying useful information annotated over it, for which resolution is not critical. If we had data sources and software for that, that might be useful too, but right now, we don’t really have those software libraries and data sources.

    I think that Snow Crash did a good job of highlighting some of the neat potential of AR, and also some of its issues:

    Putting a rubber-band on brightness:

    A black car, alive with nasty lights, whines past her the other way, closing in on the hapless Hiro Protagonist. Her RadiKS Knight Vision goggles darken strategically to cut the noxious glaring of same, her pupils feel safe to remain wide open, scanning the road for signs of movement.

    Highlighting hazards in low-light conditions using sensor fusion can be useful (current high-end US military NVGs do some of this):

    He turns off his view of the Metaverse entirely, making the goggles totally transparent.  Then he switches his system into full gargoyle mode: enhanced visible light with false-color infrared, plus millimeter-wave radar.  His view of the world goes into grainy black and white, much brighter than it was before.  Here and there, certain objects glow fuzzily in pink or red.  This comes from the infrared, and it means that these things are warm or hot; people are pink, engines and fires are red.

    The millimeter-wave radar stuff is superimposed much more cleanly and crisply in neon green.  Anything made of metal shows up.  Hiro is now navigating down a grainy, charcoal-gray avenue of water lined with grainy, light gray pontoon bridges tied up to crisp neon-green barges and ships that glow reddishly from place to place, wherever they are generating heat. It’s not pretty.  In fact, it’s so ugly that it probably explains why gargoyles are, in general, so socially retarded.  But it’s a lot more useful than the charcoal-on-ebony view he had before.

    And it saves his life.  As he’s buzzing down a curving, narrow canal, a narrow green parabola appears hanging across the water in front of him, suddenly rising out of the water and snapping into a perfectly straight line at neck level.  It’s a piece of piano wire.  Hiro ducks under it, waves to the young Chinese men who set the booby trap, and keeps going.

    The radar picks out three fuzzy pink individuals holding Chinese AK47s standing by the side of the channel.  Hiro cuts into a side channel and avoids them.

    Overlaying blueprint data can permit “seeing through walls”:

    "YOU ARE HERE," he says.  His view of the Enterprise’s hull – a gently curved expanse of gray steel – turns into a three-dimensional wire frame drawing, showing him all the guts of the ship on the other side. Down here along the waterline, the Enterprise has a belt of thick antitorpedo armor.  It’s not too promising.  Farther up, the armor is thinner, and there’s good stuff on the other side of it, actual rooms instead of fuel tanks or ammunition holds.

    Hiro chooses a room marked WARDROOM and opens fire.  The hull of the Enterprise is surprisingly tough.  Reason doesn’t just blow a crater straight through; it takes a few moments for the burst to penetrate.  And then all it does is make a hole about six inches across.

    A lot of the obvious stuff that one might display in AR goggles doesn’t compete well with just showing reality in terms of usefulness:

    He stumbles forward helplessly as something terrible happens to his back.  It feels like being massaged with a hundred ballpeen hammers.  At the same time, a yellow sputtering light overrides the loglo.  A screaming red display flashes up on the goggles informing him that the millimeter-wave radar has noticed a stream of bullets headed in his direction and would you like to know where they came from, sir?

    Hiro has just been shot in the back with a burst of machine-gun fire.  All of the bullets have slapped into his vest and dropped to the floor, but in doing so they have cracked about half of the ribs on that side of his body and bruised a few internal organs.  He turns around, which hurts. The Enforcer has given up on bullets and whipped out another weapon.  It says so right on Hiro’s goggles: PACIFIC ENFORCEMENT HARDWARE, INC. MODEL SX-29 RESTRAINT PROJECTION DEVICE (LOOGIE GUN).

    He turns off all of the techno-shit in his goggles. All it does is confuse him; he stands there reading statistics about his own death even as it’s happening to him. Very post-modern. Time to get immersed in Reality, like all the people around him.


  • Additionally, the virtual screen was not fixed in space but moved around when you moved your head, which gave me vertigo after prolonged use.

    The current version of these glasses has an optional device, sold separately, called a Beam that provides this fixed-in-real-space projected screen – I assume that it’s got enough 3D hardware and such to do the projection.

    The problem, as I mention in another comment, is that if you do any kind of 3D projection of a virtual monitor, you have to “spend” resolution from the physical display on it: the virtual monitor has to be enough lower-resolution that it still looks good after projection, and I don’t want to give up that resolution.

    Like, there are physically 1080p, 1920x1080 OLED displays in front of each eye on these.

    My laptop monitor, right now, is 2560x1600. So even from the start, I spend resolution just to get down to the resolution of the displays in the physical HMD.

    Then I’m projecting a virtual monitor on that. You could argue about what a reasonable virtual-to-physical ratio is, but it’s gotta be less than 1.

    The virtual display might be big in terms of visual arc and use a lot of my optical receptors. But at the end of the day, I want to shovel a lot of data into those optical receptors.

    Maybe if someone has really blurry vision or something like that, and can’t see anything like the kind of laptop screen resolution that I’m describing, it’d be less of an issue. But I’m not there (yet!).
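
    Rough numbers, using the resolutions above (the coverage ratio is purely my assumption):

    ```python
    # How much native laptop resolution survives when a virtual monitor is
    # projected onto the HMD's physical panels.
    laptop_px = (2560, 1600)  # laptop panel, per above
    hmd_px = (1920, 1080)     # per-eye physical panel in the glasses

    # A projected virtual screen can't use every physical pixel 1:1; assume
    # (hypothetically) it covers ~70% of the panel in each dimension.
    coverage = 0.7
    virtual_px = (int(hmd_px[0] * coverage), int(hmd_px[1] * coverage))

    for name, (w, h) in [("laptop", laptop_px), ("HMD", hmd_px), ("virtual", virtual_px)]:
        print(f"{name:>7}: {w}x{h} = {w * h:,} px")
    # The virtual monitor ends up with roughly a quarter of the laptop's pixels.
    ```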

    EDIT:

    The screen was too dark in bright rooms

    At least one of the current models that XReal has out has three levels of cycleable opacity on the display – IIRC it’s a “premium” feature on the high-end model, with a lower-end model that can’t do variable opacity. IIRC there’s a button on the body of the glasses or something. I don’t know if the specific ones that guy tested were this model, but if not, they do make a model that can do it.


  • It might be able to do that.

    From memory, those Xreal glasses have this optional doohickey called a Beam that you can plug them into. If you have that, it can “project” a monitor into reality so that it doesn’t move with your head. So they can do the projection bit. Like, they aren’t just dumb HMDs that throw an image in front of your eyes. They’re AR, so like VR goggles, they do headtracking and such, but they’re intended to have you view a mix of the real world and the virtual projected elements.

    The problem is that if you’re rendering a virtual image of a screen on a screen, you need the physical screen to be significantly higher-resolution to look right — you have to throw away some of your resolution on this. That’s true of VR or AR goggles. I’d think that the first practical monitor-replacement HMD is gonna avoid doing any 3D projection of virtual monitors.

    EDIT: Yeah, those goggles can do it:

    https://www.reddit.com/r/Xreal/comments/182wwxb/can_i_use_3_virtual_monitors_and_2_physical/

    I do this with 4. I have 3 floating ones above and then look at the regular monitor thru the lenses. I also do this when watching TV and working in bed. I rest my head against the headboard looking up at the floating windows and straightforward when looking at the tv.


  • However, glasses, a mini PC, a keyboard, and a battery are smaller than a laptop.

    It relaxes the X and Y dimensions — no screen. But it might take up more volume, depending upon the configuration. Like, laptops use flat keycaps and scissor-switch keys to save space, while he’s using a keyboard with traditional full-height keycaps and regular keyswitches. That alone is a not-insignificant amount of volume.

    EDIT: I have a split-ergo keyboard with standard full-size keyswitches that fits into a folding case, one half on each side. If I were going to carry an external mechanical key keyboard with a portable PC, that’s probably what I’d use. Split ergo keyboards are expensive, though, so not as cost-effective as a standard one-piece mechanical key keyboard. !ergomechkeyboards@lemmy.world


  • Much to my surprise, it didn’t take long at all to get used to working while wearing AR glasses.

    Could you see yourself spending a full day working with smart glasses instead of using a monitor?

    For me at least, that “HMD all day” bit is the limiting factor. I don’t want to wear an HMD all day. My experience has been that they’re sensitive to slight misalignment, which makes them go blurry. Traditional displays are nice and crisp.

    I think that to be something that I’d want to use, the thing would need to do something like mechanically move the displays or optics inside the HMD to keep them at a very precise, calibrated position relative to my eyeball, so that I don’t need to futz with my movements slightly misaligning the HMD.

    In 2025, we don’t have an HMD that can do that.

    EDIT: Also, this doesn’t matter much if you’re watching a movie or something; it’s not visible then. But it is a visible issue if you’re working with text or the like, if you want to full-on replace your display.


  • It takes longer than dehibernating a laptop, but honestly, I have my laptop set up to hibernate if I have it closed for more than ten minutes or so, and it takes several seconds to dehibernate, even off NVMe, with some of that happening in parallel. My last laptop was a lot slower; it took something north of ten seconds to dehibernate. He’s gotta drop a keyboard on his desk, unzip his HMD case, and plug each in (if he’s not using a wireless keyboard or the wireless accessory for that HMD, neither of which I would personally use). Some of that at least can be parallelized. And that HMD has integrated headphones, IIRC — I carry headphones with me for my laptop, so he doesn’t need to do that bit.

    EDIT: Oh, and his trackball/trackpad/mouse or whatever. I carry a trackball with my laptop, but don’t usually use it.




  • While I’ve also been interested in similar systems, the author can accomplish one of his goals — the mechanical keyboard one — with a fairly-traditional laptop setup: he needs one of those hybrid laptops with a screen that can swivel around to act as a tablet. Then he just converts it to “tablet” mode and uses it as a monitor, and he can use whatever keyboard he wants without the laptop keyboard sticking out at him and being in the way. It does limit the laptop hardware options, though.

    And it doesn’t buy him the other stuff that he’s gunning for, like more customizable hardware or a screen with a larger FOV or such.