5n2 port bonding with Netgear R8500 and drive migration issue.
03-27-2017, 12:45 PM
Post: #1
5n2 port bonding with Netgear R8500 and drive migration issue.
I just got my 5n2 and have some issues.

#1: I updated the firmware on both units and moved my 5N drives over. The 5N2 works but says it is in rebuild mode with 49 hours to go. The data is accessible, but I did not expect a rebuild; I installed the drives in the same order and everything.

#2: I have enabled bonding on the 5N2 and it shows the same IP address for both ports, but my Netgear R8500 router says link aggregation is inactive. I am using the correct two ports on the router and have aggregation enabled. Does the Drobo not use the same protocol? The bonding was the reason I upgraded, so my feelings are very mixed right now.

Any ideas?
03-27-2017, 03:29 PM
Post: #2
RE: 5n2 port bonding with Netgear R8500 and drive migration issue.
It seems the Netgear R8500 wants the device configured as 802.3ad. Is that the Drobo's default? How do I set the teaming mode?
03-27-2017, 06:22 PM (This post was last modified: 03-27-2017 06:22 PM by Paul.)
Post: #3
RE: 5n2 port bonding with Netgear R8500 and drive migration issue.
hi sbushman, can you remember how much data you had on the drobo before the migration?
i'm not sure about the rebuild (is there a possibility that a drive had not fully spun down and was ejected shortly after powering off the 5n?)

i guess more importantly, did your data and shares come back up soon after powering up the 5n2? (if so, it may just be that the 5n2 noticed a 'regular' coincidental hard drive issue and is running its usual protection features for you)

btw you may have seen this already, though there is a page here (with some more links too) that could help a bit? :
http://manuals.drobo.com/#t=5N2_Connecting_Cables_and_Powering_On_Your_Dro

(btw i have XP home SP2, a Drobo v1 with 2x 1TB/2x 1.5TB WD greens, & a bkp Drobo v2 with the same + a DroboShare: unused)
& a DroboS v2 with 3x WD15EADS & 2x 1TB in DDR mode on win7, & a drobo5D (all usb)
  • btw i did a sustained (write) operation for about 6 hours, and got 13.2MB/sec ...objection? "sustained" :)
    (16.7MB/s on a v2 & 47-96MB/s on a drobo-s)
03-27-2017, 08:21 PM
Post: #4
RE: 5n2 port bonding with Netgear R8500 and drive migration issue.
The shares are there and accessible, but after 8 hours the rebuild estimate keeps shifting: 19, then 21, then 29, then back to 21 hours.

The Drobo 5N was off when I removed the drives, so they were spun down. The new Drobo 5N2 was updated and rebooted before I inserted the first drive. It took a little while for the drives to be detected; the last one took the longest.

I had forgotten to bring over the hot cache SSD, so I powered down and installed it. Powered up again and it is still rebuilding. My wife will shoot me if her artwork is compromised! lol. Plus I am a photographer and all my work from the past three years is on there.

As for port bonding, is this thing using 802.3ad like my Netgear router wants? Synology, QNAP, and ReadyNAS units all have selectable modes for this; the Drobo does not.
03-29-2017, 02:51 PM
Post: #5
RE: 5n2 port bonding with Netgear R8500 and drive migration issue.
thanks for more info,
i'm not too sure about the port bonding, but the value for remaining rebuild time can fluctuate (though maybe not as much as windows explorer progress bars).

the general base value i usually take is about 1 day per 1TB of data on the drobo (accessing the drobo can slow the rebuild down a bit, though newer models might rebuild quicker than that). it's good that you can still access things; if you bear in mind how much data you have on your drobo, it should hopefully finish in about that much time at 1 day per TB. please let us know how things go
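
(if it helps to put numbers on that, here's the rule of thumb as a quick python sketch - the 24 hours per TB figure is just my ballpark from older models, not an official spec):

```python
# rough sketch of the '1 day per 1TB of stored data' rule of thumb above
# (ballpark only - actual rebuild time varies with model and access load)
def estimate_rebuild_hours(data_tb, hours_per_tb=24.0):
    """Return a rough rebuild-time estimate in hours."""
    return data_tb * hours_per_tb

for tb in (1, 2, 4, 8):
    print(f"{tb} TB stored -> ~{estimate_rebuild_hours(tb):.0f} hours (~{tb} day(s))")
```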

03-30-2017, 03:37 PM
Post: #6
RE: 5n2 port bonding with Netgear R8500 and drive migration issue.
The rebuild completed yesterday and the Drobo is humming along, but I'm still not sure about the bonding. Anybody else have it working? What kind of router/switch are you using?
04-01-2017, 06:31 PM
Post: #7
RE: 5n2 port bonding with Netgear R8500 and drive migration issue.
I too have an issue with the bonding. I've opened a support ticket and will post the outcome.

I'm using an EdgeRouter Lite with a Dell PowerConnect 5548 switch.

With the bond turned on, I get 80% packet loss and latencies above 250 ms when pings finally get through. If I disable the bond, all returns to normal. I even tried LACP/LAG on the switch, but that resulted in 100% loss; I didn't expect that to work.
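
For anyone who wants to reproduce the numbers, this is roughly how I measured it, sketched in Python (the IP is a placeholder for the Drobo's address, and it assumes a Linux/macOS-style ping; Windows flags and output differ):

```python
import re
import subprocess

def ping_stats(host, count=20):
    """Ping `host`; return (packet loss %, average rtt in ms or None).

    Assumes a Linux/macOS-style ping that accepts -c and prints a
    '% packet loss' summary line; flags and parsing differ on Windows.
    """
    out = subprocess.run(["ping", "-c", str(count), host],
                         capture_output=True, text=True).stdout
    loss = float(re.search(r"([\d.]+)% packet loss", out).group(1))
    rtt = re.search(r"= [\d.]+/([\d.]+)/", out)  # min/avg/max summary line
    return loss, (float(rtt.group(1)) if rtt else None)

loss, avg = ping_stats("192.168.1.50")  # placeholder IP
print(f"loss: {loss:.0f}%  avg rtt: {avg} ms")
```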
04-02-2017, 12:05 PM (This post was last modified: 04-02-2017 12:33 PM by jrinehart76.)
Post: #8
RE: 5n2 port bonding with Netgear R8500 and drive migration issue.
(03-30-2017 03:37 PM)sbushman18 Wrote:  The rebuild completed yesterday and the Drobo is humming along, but I'm still not sure about the bonding. Anybody else have it working? What kind of router/switch are you using?

sbushman18, I'm curious what your specific issues are. When I configure aggregation on my switch, the Drobo is totally inaccessible. Even in a port-channel, it works fine with one NIC disconnected, but when both are used in "bond" mode, it's useless.

I have a feeling this "bond" idea isn't fully baked yet. Still waiting on support to reply.


EDIT: I have resolved my particular issue. It seems that the Drobo has no knowledge of LACP or LAGs in the typical network sense. Once I set my port-channel to STATIC instead of LACP, I was able to get both ports working. Also, it's my understanding that no switch configuration is needed in most environments, so I would try turning off aggregation on your R8500 and see what happens. Hope that helps.
04-04-2017, 10:21 AM
Post: #9
RE: 5n2 port bonding with Netgear R8500 and drive migration issue.
Thanks a lot for that, JR76. I disabled aggregation and just did two test transfers, one in each direction, and got pretty much 110+ MB/s sustained. The original 5N used to peak around there but was closer to 70-80 MB/s sustained, so I'm seeing much better (closer to the theoretical max over gigabit) speed now.
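
(For what it's worth, the "test transfers" were nothing fancier than timed copies of a big file, one in each direction, roughly like this Python sketch; the paths and file name are placeholders for my local file and the mounted Drobo share:)

```python
import shutil
import time
from pathlib import Path

# placeholder paths - point SRC at a large local file and DST at the
# mounted Drobo share (the mount point will differ on your system)
SRC = Path.home() / "testfile_10GB.bin"
DST = Path("/Volumes/Drobo5N2") / "testfile_10GB.bin"

start = time.monotonic()
shutil.copyfile(SRC, DST)  # swap SRC/DST to test the other direction
elapsed = time.monotonic() - start

size_mb = SRC.stat().st_size / 1e6
print(f"{size_mb:.0f} MB in {elapsed:.1f} s -> {size_mb / elapsed:.1f} MB/s")
```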

I may just get a new managed switch and move all of my devices over to it. I have 20 devices split between three 8-port unmanaged switches and the router, which is not ideal. In fact, the last three ports on the router share 1 Gb/s of bandwidth between them.

I am thinking I could use the aggregate ports on the router to uplink to a 24-port managed switch, and then hook everything else in there.

Anything to gain by going with jumbo frames? Or will I cause my streaming clients grief by doing that?
04-04-2017, 12:19 PM
Post: #10
RE: 5n2 port bonding with Netgear R8500 and drive migration issue.
(04-04-2017 10:21 AM)sbushman18 Wrote:  Thanks a lot for that, JR76. I disabled aggregation and just did two test transfers, one in each direction, and got pretty much 110+ MB/s sustained. The original 5N used to peak around there but was closer to 70-80 MB/s sustained, so I'm seeing much better (closer to the theoretical max over gigabit) speed now.

I may just get a new managed switch and move all of my devices over to it. I have 20 devices split between three 8-port unmanaged switches and the router, which is not ideal. In fact, the last three ports on the router share 1 Gb/s of bandwidth between them.

I am thinking I could use the aggregate ports on the router to uplink to a 24-port managed switch, and then hook everything else in there.

Anything to gain by going with jumbo frames? Or will I cause my streaming clients grief by doing that?

Great news! Glad you got it working!

My philosophy on jumbo frames has always been that they aren't needed unless it's iSCSI storage traffic (think iSCSI datastores in VMware) or 10G (or greater) Ethernet. In your network, jumbo frames are just an added level of complexity with no gain. And keep in mind, anything that would use the Drobo, assuming jumbo frames, would also need to support an MTU of 9000 or greater, which further complicates things compatibility-wise, especially if you are hanging a WAP off your switch/router with wireless clients accessing the Drobo.

Of course, there isn't anything wrong with doing jumbo frames; it's just an added layer of complexity that I wouldn't want at home. If you try it and get faster/better results, please share!
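
If you do try it, one sanity check worth running first is whether the path actually passes 9000-byte frames end to end; here's a rough Python sketch (this uses Linux ping's don't-fragment flag, and the IP is a placeholder):

```python
import subprocess

def path_passes_mtu(host, mtu=9000):
    """Return True if pings of size `mtu` reach `host` unfragmented.

    Uses Linux ping's -M do (don't fragment); the payload is the MTU
    minus 28 bytes of IP + ICMP headers. Flags differ on macOS/Windows.
    """
    payload = mtu - 28  # 9000 - 28 = 8972
    result = subprocess.run(
        ["ping", "-c", "3", "-M", "do", "-s", str(payload), host],
        capture_output=True,
    )
    return result.returncode == 0

host = "192.168.1.50"  # placeholder - use your Drobo's IP
print("jumbo path OK" if path_passes_mtu(host)
      else "path drops or fragments 9000-byte frames")
```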