Trouble with transferring large files to HFS+ volume

I have an RPi Model B connected to my TV and to a LAN router with a network cable. Also attached is a powered USB hub with a small USB-powered hard disk, formatted as HFS+ (non-journaled, because Linux can only mount journaled HFS+ read-only), since my other computers are Macs and that is convenient. There are no reliable ext4 or (free) NTFS drivers for OS X. I can't use FAT because some files are larger than 4 GB and I don't want to re-encode. I haven't tried exFAT because the Linux support seems sketchy:

formatting link

I installed hfsplus support: "sudo apt-get update && sudo apt-get install hfsplus hfsutils hfsprogs". The disk mounts automatically when plugged in: "/dev/sda2 on /media/usb160 type hfsplus (rw,nosuid,nodev,relatime,umask=22,uid=0,gid=0,nls=utf8,uhelper=udisks)".

Now, almost invariably, when I try to transfer large files to the RPi-connected hard disk from other computers (by scp, either from my desktop, which is also wired to the router, or from my laptop, which has a wireless connection to the same router), there is some sort of file system error which makes the disk remount as read-only; it also moves to the other device node, alternating between sda and sdb. The solution is then to run "sudo fsck.hfsplus /dev/sdb2" (or sda2, whichever it currently is), then unplug and replug the disk, and it's read/write again. So what I end up doing, to avoid endless such errors, is schlepping the disk to my desk and copying the files there. Tedious.
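As an aside, that manual repair cycle can be sketched as a small script. Everything here is an assumption built from the mount line quoted earlier: the mountpoint `/media/usb160` comes from that output, the current device node (sda2 or sdb2) is read back from `/proc/mounts`, and the script only acts if the volume has actually flipped to read-only:

```shell
# Sketch of the manual repair cycle: if the HFS+ volume has remounted itself
# read-only, fsck it and remount it. Mountpoint taken from the post above.
MOUNTPOINT=/media/usb160

# True (exit 0) if the given mountpoint is currently mounted read-only.
# $2 lets you point at a different mounts table; defaults to /proc/mounts.
is_readonly() {
    awk -v mp="$1" 'BEGIN { rc = 1 }
        $2 == mp && $4 ~ /(^|,)ro(,|$)/ { rc = 0 }
        END { exit rc }' "${2:-/proc/mounts}"
}

if is_readonly "$MOUNTPOINT"; then
    # Look up whichever device node the disk landed on this time (sda2/sdb2).
    dev=$(awk -v mp="$MOUNTPOINT" '$2 == mp { print $1 }' /proc/mounts)
    sudo umount "$MOUNTPOINT"
    sudo fsck.hfsplus "$dev"
    sudo mount "$dev" "$MOUNTPOINT"   # instead of physically replugging
fi
```

Note that this only papers over the symptom; whatever is corrupting the file system in the first place is still there.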

The disk works fine when slowly reading large files, i.e. it remains mounted r/w when playing videos from it.

From reading the RPi website forum, I believe a couple of months ago there was a spell of basic USB driver trouble. So I suspect it might be a Raspbian USB driver problem, overloading or crashing because all the data is both coming in and going out over USB at the same time at full speed (on the Model B, the Ethernet controller hangs off the same USB bus, so every byte crosses that bus twice). Of course, it could also be an hfsplus file system driver problem, but from the outside that one seems more mature.
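If the shared USB bus really is the bottleneck, one low-effort experiment (my suggestion, not something from the thread) is to throttle scp so the incoming network stream and the outgoing disk writes never peak at the same time. scp's `-l` flag takes a limit in Kbit/s; the helper below just does the unit conversion:

```shell
# Convert a desired transfer ceiling in MB/s into the Kbit/s unit that
# scp's -l flag expects (1 MB/s = 8000 Kbit/s in decimal units).
mb_per_s_to_scp_limit() {
    echo $(( $1 * 8000 ))
}

# Example invocation (host and paths are illustrative only):
#   scp -l "$(mb_per_s_to_scp_limit 2)" big-movie.mkv pi@raspberrypi:/media/usb160/
mb_per_s_to_scp_limit 2   # prints 16000
```

If the read-only remounts stop at, say, 2 MB/s but return at full speed, that would point firmly at the USB driver rather than at hfsplus.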

Has anyone had similar trouble with file transfer over the network to an external hard disk? Or with hfsplus? Ideas to solve or avoid the problem? Thanks.

[Also posted to
formatting link
]
Reply to
A. Dumas

...snippage....

I don't run Macs, but I haven't had any similar problems with Linux <-> Windows transfers. Some suggestions:

- reformat the disk as ext4 and use scp, sftp or ftp to do file transfers rather than directly accessing the disk from a Mac.

Not directly relevant, but I recently discovered that the excellent gftp FTP client can also act as an sftp client (select SSH2 rather than FTP as the transfer protocol). Is there a Mac port of gftp?

- Use a NAS box in place of the USB disk. Most of these run Linux internally, so they should not suffer from the 4 GB file-size limit and should be accessible to both your Macs and the RPi over your home network.

Get one that supports RAID and can make offline backups to a USB drive and you've also got a nice backup solution for all your computers.

--
martin@   | Martin Gregorie 
gregorie. | Essex, UK 
org       |
Reply to
Martin Gregorie

OS X does have full NTFS support; it isn't enabled for writing by default, but write support does exist.

formatting link

But I would recommend a NAS box as well rather than trying to deal with a global volume type that you can shuffle around between different machines.

--
Doug McIntyre 
doug@themcintyres.us
Reply to
Doug McIntyre

I already use scp exclusively to transfer files when the disk is connected to the Pi (since I'm ssh-ing in anyway to use omxplayer). But I still want to be able to use the disk directly over USB on my other computers (for maintenance, larger/multiple transfers, portable file storage out of the house, whatever). That won't work with ext4.
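For the large transfers themselves, rsync may beat plain scp because it can resume after exactly the kind of mid-copy failure described above: `--partial` keeps the half-written file and `--bwlimit` (in KB/s) doubles as a throttle. A sketch that just assembles the command line; the host and paths are placeholders, not anything from the thread:

```shell
# Build an rsync command that can resume an interrupted copy and cap the
# transfer rate; --bwlimit is in KB/s, so 2000 is roughly 2 MB/s.
build_rsync_cmd() {
    # $1 = source, $2 = destination, $3 = bandwidth cap in KB/s
    echo "rsync --partial --progress --bwlimit=$3 $1 $2"
}

# Placeholder host/paths; run the printed command by hand once it looks right:
build_rsync_cmd big-movie.mkv pi@raspberrypi:/media/usb160/ 2000
```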

Not prepared to spend ?100 or so on this.

Thanks for your reply.

Reply to
A. Dumas

Thanks. Like the article says, it's probably not fully reliable. Now, the data on my drive obviously isn't very critical, but it'd still be a pain to lose it.

Reply to
A. Dumas

The r/w option for NTFS can work, but only with smaller files; I had real problems transferring large, multi-gigabyte files.

I had much better success with the hfsplus driver in Linux: I managed to copy several tens of gigs onto a USB disk. This was with a desktop Linux machine, though, so it could well be that your problems are associated with USB issues on the RPi.

Reply to
chris

Thanks for adding.

Reply to
A. Dumas

Hi Alexandre,

Did you find any solution to your issue? I have much the same issue with an NSLU2 under Debian "wheezy". I have an hfsplus USB disk mounted on the Debian server and shared with Samba, mainly for my Mac mini. Reading large files creates no issue, but copying large files from the Mac mini to this shared partition causes the NSLU2 to hang at 460.2 MB. I have tried disabling journaling on the hfsplus volume, but it doesn't help. I made a try at copying the file with scp, but it was much slower and finally hung at 580 MB. I can copy large files to other Samba shares that are ext3 with no issue.

I have no clue which log/debug file(s) I should start looking at first to understand what's going on and what is causing the server to hang (I can still ping it but can't connect anymore). Is it a memory size issue? So let me know if you have made progress on your side.

Thanks,
Uriel


Reply to
uriel.auerbach
