

This thread (special thanks to Ewald for mentioning the Linux e2tools), along with the old thread that's around mentioning R-Linux, helped me recover my data from my WD NAS drive. Thought I'd contribute the missing piece in my scenario. If your situation is similar to mine, this might help. You'll need some Linux proficiency, but it might prove useful.

My WD NAS wouldn't boot after a power outage (steady yellow light), and although I didn't have any critical data on it, I had some large files that were not yet uploaded to my offsite backup. I wanted to recover them if possible, and I ended up here in the forums. I found out about R-Linux in an older thread, and it had worked for most people. It runs on Windows and Linux, but I'm currently running Mac OS (I don't have a Windows machine on hand).

Since VMware Fusion allows a 30-day trial, I downloaded it and created a Linux VM. It ran, and I could see my WD drive with all the files. Doing a recovery using R-Linux looked promising… but there was a problem: the folders themselves were unreadable when I recovered them. No big issue, since I could manually change the permissions (I used an app called BatChmod, from LagenteSoft, to do that). The bigger problem was that I could not recover any folder that contained sub-folders; R-Linux would not allow it because the folders had the wrong permissions. As a last resort, I started recovering my files folder by folder (this drive had a complex directory structure). After a while, I realized this was not working: I had too many subdirectories, and I had to go one by one.

Since R-Linux only presents a GUI, and I couldn't mount the drive via the terminal, this was the only way to go. Then I found this thread, installed e2tools, and realized that I could now copy files using the terminal. I tried using these tools to copy a directory via the command line, but after many failed attempts I gave up: these tools can only copy individual files, not full directories (at least I couldn't find a way).
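To make the terminal part concrete: e2tools addresses the unmounted ext filesystem with a device:path argument, one file at a time. Something like this (the partition and destination paths are the ones from my setup; the share and file names below are just placeholders):

# List a directory on the NAS partition without mounting it
e2ls "/dev/sdb4:/shares"

# Copy a single file out to the local/VM disk
e2cp "/dev/sdb4:/shares/Public/example.bin" "/mnt/hgfs/NAScopy/example.bin"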
After going back to R-Linux for one more attempt, I noticed that it lets you export the whole directory structure into a text file. Great, at least I could use that and I wouldn't have to generate that list in the script.
Instead of using Perl or Python (like a normal programmer would), I decided to create a Bash shell script (note: I'm not a shell script programmer). I only needed to create a few directories and copy a few files; how hard could that be? Turns out my directories had complex names, some containing spaces, and string manipulation is a bit weird in shell scripts. It took longer than it should have, but in the end I was able to write a script that goes through my list of files (the one I exported from R-Linux) and copies each file, creating the folder structure at the same time. The script is a bit rough around the edges (I was learning on the go), but it got the job done.

Like I mentioned, I used R-Linux to export the file and directory listing into a text file; without this text file, the script won't work. If someone bricks a WD NAS, finds my answer, and is a programmer, maybe they can add that portion to the script, so there's no dependency on R-Linux and someone else who runs into this issue can simply run one script after installing e2tools.

You have to specify the path to the WD NAS, the path where you want to copy the files to, and the path of the text file containing all the entries exported from R-Linux. You should create a small test file with only a few entries to make sure it works. Remember this will copy each file one by one, so it can take a while if you are moving several TB, like I was. Hope this helps at least one person out there!

It's a Bash shell script; you need to save this in a text file. The paths to set at the top look like this:

Sourcepath="/insert/path/of/WD/NAS/"                          # i.e. "/dev/sdb4:/"
destinationpath="/insert/path/of/destination/HD/"             # i.e. "/mnt/hgfs/NAScopy/"
FILE=/path/to/file/with/filenames/exported/from/RLinux.txt    # i.e. mnt/hgfs/NAScopy/tree
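The full script isn't reproduced here, but a minimal sketch of the loop described above could look like the following. The variable values are the examples from above; the assumption that the R-Linux export lists one file path per line, relative to the root of the NAS partition, is mine.

#!/bin/bash
# Rough sketch, not the original script: copy every file listed in $FILE
# from the NAS partition to the destination, recreating the folder structure.

Sourcepath="/dev/sdb4:/"
destinationpath="/mnt/hgfs/NAScopy/"
FILE="/mnt/hgfs/NAScopy/tree"

while IFS= read -r entry; do
    # Skip blank lines in the export
    [ -z "$entry" ] && continue
    # Recreate the directory part of the path on the destination drive;
    # quoting keeps names with spaces intact
    mkdir -p "${destinationpath}$(dirname "$entry")"
    # e2cp copies one file at a time out of the unmounted ext filesystem
    e2cp "${Sourcepath}${entry}" "${destinationpath}${entry}"
done < "$FILE"

Copying one file at a time is slow, but that's what e2tools supports, and since the loop doesn't stop on errors, a file that fails to copy can simply be retried by hand afterwards.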
