- cross-posted to:
- datahoarder@lemmy.ml
These things (and Seagate’s) have the USB interface soldered on, so if the drive dies, forget about the data — no way to connect it to another USB adapter to try to recover it. Granted, it’s usually the drive itself that dies, but in those cases you have a 100% rate of non-recovery. Other brands use standard drives. My favorite is Toshiba.
Why would the USB electronics be particularly likely to fail relative to other electronics on the drive?
Because you flex and replug the interface often.
The connector you use to plug in your phone, tablet, drives and other devices is very often the failure point, unless you break screens or get water in them.
Normally you simply have an HDD with a SATA interface in there, so if the USB connector fails, you can still easily recover your data.
With these things, you’re lucky if they even offer the possibility of repairing or recovering the drive.
In my experience the drive fails more often than the adapter, but they do fail. Also, there is a good chance of recovering data from a failed drive; with a soldered adapter it’s basically impossible. The worst part is that these externals are often used for backups.
Because that’s usually the cheapest part, and manufacturers get away with cheapening it further.
Solder joints
Not particularly, but it happens.
I solder new usb connectors and all manner of other connectors on to stuff all the time.
I’m at a 100% success rate getting data off stuff that just needs new connectors.
If you need data recovered, the literal best case scenario is that it’s just got a bad connector.
Soldering is not the problem, unless it’s SMD or tiny; it’s getting a non-standard USB interface.
You mean in the case of a dead USB IC, or that the USB port isn’t standard?
Just don’t look at the failure rates
OMG is it bad. We used a couple of WD drives for a surveillance camera array and they didn’t last a year. Two drives failed nine months apart. Ended up going on Backblaze and picking what looked best for our XFS RAID 10, having learned that lesson the hard way.
Backblaze publishes drive failure reports.
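For reference, Backblaze reports failure rates as an annualized failure rate (AFR) derived from drive-days in service. A minimal sketch of that calculation (the numbers below are made up for illustration, not real Backblaze data):

```python
def annualized_failure_rate(drive_days, failures):
    """Backblaze-style AFR: failures per drive-year of service, as a percentage."""
    drive_years = drive_days / 365.0
    return 100.0 * failures / drive_years

# Hypothetical example: 10,000 drive-years of service, 120 failures
print(annualized_failure_rate(drive_days=3_650_000, failures=120))  # 1.2% AFR
```

Normalizing by drive-days rather than drive count matters, because models that were deployed mid-year would otherwise look artificially reliable.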
Yeah our company learned the hard way when they bought out G-DRIVE. Got a line failure on 4x 20TB drives.
Switched back to LaCie and Glyph.
I have a dual NVMe USB3 caddy that’s smaller than most 2.5″ HDD housings, currently holding two 2TB drives, and you can buy 4 and 8TB NVMe drives these days too. I can throw that thing out of a car and it won’t care.
And the drives are easily swappable and so are the electronics in the casing.
So no, 2.5" HDD’s still are an utterly dead end of technology.
Especially with these and some other vendors, the USB interface is part of the drive (there’s no SATA port on them), so you can’t swap it or take the drive out for data recovery. They are HDD tech, which doesn’t handle shocks or any other sort of roughhousing; they’re slow as shit and use far more power than any NVMe drive.
Which NVMe USB3 caddy are you using? I’d like to get me one.
Looks like this one except that it is sealed on one end and the caddies for the two drives have a cover plate that screws in over a gasket and rubber ring.
I got it in a shop in Hong Kong when I was there for a convention earlier this year. No idea if you can find it online, maybe somewhere like Alibaba.
From the article:
UPDATE 5/17, 6 PM: Western Digital has confirmed that the new 2.5-inch 6TB HDDs use SMR platters
SMR = shingled magnetic recording https://en.m.wikipedia.org/wiki/Shingled_magnetic_recording - “continuous writing of large amount of data is noticeably slower than with CMR drives”
They’re external, you’re not going to be using them for performance anyway.
Yes, I should’ve added - whether the write speed matters depends on your own use case.
For my SMR drive, it’s taking roughly 2GB of backup files every few hours, in the background, and there’s plenty of empty space on the drive. In my case, it doesn’t matter at all.
However, if you’re sat at your computer, frequently transferring large files while the drive is at least half full, and you have to wait for completion… Then it’ll matter.
True, but to a point. Being external, it’d be something I plug in occasionally to back up large project files. I don’t technically need blazing speeds but I’d still be displeased if my transfers took 10 minutes or more.
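If you want to see whether the SMR penalty matters for your use case, you can time a sustained sequential write yourself. A rough sketch (the path and sizes are placeholders; a drive-managed SMR drive will typically show throughput collapse once its CMR cache region fills, so a larger `total_mb` exposes the effect):

```python
import os
import time

def sequential_write_mb_s(path, total_mb=1024, chunk_mb=8):
    """Write total_mb of random data in chunk_mb pieces; return throughput in MB/s."""
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(total_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # make sure data actually hit the disk, not just the page cache
    elapsed = time.perf_counter() - start
    return total_mb / elapsed

# e.g. sequential_write_mb_s("/mnt/external/testfile", total_mb=8192)
```

Run it twice on a fairly full drive — first pass warm, second pass sustained — and compare; a big drop on the sustained run is the SMR rewrite penalty showing up.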
Bought some of the old versions for backup drives. That was a mistake.
why?
Very high failure rate. Even Sony 2.5″ drives have a similar rate of death. For some reason this form factor is just terrible for longevity.
Bingo. Sorry, had typed a reply about my failure rate and difficulties getting an RMA but forgot to submit.
My bet is on density. You cram so much in such a tiny space, so any tiny imperfection or fault will corrupt the data or render the drive unusable.
At the time it was fine. I had an array of 4TB drives that I was backing up with a series of 5TB drives. They were just so unreliable; all but one failed, while the array they backed up is still spinning strong.
Not exactly reliable, and a less-than-easy RMA process.
Sorry, had typed this and forgotten to hit submit :(
I paid around $300 for one of the first 2TB drives. Surprisingly, it hasn’t come that far.
I’ve got the 5TB version of this drive as a backup for my gaming laptop. Haven’t had any problems with it.
What does one need 6TB of storage for?
Videography
Photography
Downloading Machine Learning Models
Data for Training ML Models
Training ML Models
Gaming (the games themselves or saving replays)
Backing up movies/videos/images etc.
Backing up music
NAS
Take your pick, feel free to mix and match or add on to the list.
And music production, that takes a tonne of space.
Porn
My GOG and Bandcamp libraries.
My mate has 120TB on his NAS and it’s about half full. He’s got programs that automatically download music, movies, shows, and more as soon as they’re released.
Large capacity drives are good for backups, especially if you’re backing a lot of media, such as a DVD/Bluray collection.
Some people actually use their computer.
Scientific workloads often involve very large datasets. It might be high resolution data captured from various sensors, or it might be more “normal” data but in huge quantities. Assuming the data itself is high quality, larger datasets mean more accurate conclusions.