Receiving by Transmission on Ubuntu? You mean, just file transfer, or play? There are nicer torrent clients on Ubuntu... but I guess there's value in just having one torrent-based architecture for everything. Would like to see the details of this setup actually.
One plus is that the exact same configuration can be opened up to wider file sharing when NFS or other more typical file-sharing systems would start to have problems with their security and bandwidth assumptions - no such issue with torrents!
Ah, so that explains it. But why not sell them off and get a much better - like 10Gbps or at least 4Gbps - home network? They really aren't giving you a lot of bang for the buck in this config.
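For the curious, a minimal sketch of what the Transmission side of a setup like this might look like on Ubuntu (the paths, credentials and file names here are my guesses, not the actual config):

```shell
# hedged sketch: seeding a share via transmission-daemon on Ubuntu
sudo apt install transmission-daemon
sudo systemctl stop transmission-daemon     # daemon must be stopped before editing its settings
# point "download-dir" at the storage pool and whitelist the LAN in
#   /etc/transmission-daemon/settings.json
sudo systemctl start transmission-daemon
# hand peers a .torrent of the share, or add one remotely (hypothetical credentials/file):
transmission-remote --auth user:pass --add share.torrent
```

Every box that completes the download becomes another seed, which is exactly the bandwidth property NFS can't give you.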
Which was my only reason not to run VMware ESXi... hardware limits. Support for many common cheap cards, like the D-Link gigabit Ethernet cards, was withdrawn across 5.5, 6.0 and 6.5; it's nowhere near as flexible as ESXi 5.1. Which is a reason to use VirtualBox, Xen, etc., not VMware, if you have no real reason to run Windoze.
VMs obviously need to control some graphics capabilities in media-intensive apps, but an alternative would be to run a 100Mbps Ethernet port to a smart TV and use a virtual video driver. Which I assume VMware ESXi supports somehow... unless... this is the kind of thing I didn't want to look up, so I went with a bare-metal Ubuntu server that hosts OpenZFS just fine.
This is very nice and exactly what I would have put around my AM3 motherboard if I was buying new. Notice that my actual build https://pcpartpicker.com/user/Craig_Hubley/saved/CnBvK8 also had a 450W PS, a mid-tower case, a smallish (360GB in mine, 500GB here) boot drive, cheapo optical and silent-at-all-costs video.
I would however dual-boot the machine as there are many things Linux diagnoses far better than Windows (like memory faults) and you always want to be able to boot the machine whatever Microsoft does to your firstborn/license/activation.
Where I think you've made a bit of a mistake, though, is the video card. It's going to draw more power because it won't Crossfire with your APU. There's almost no point in having a 5450 in any box that already has the A8-5600K in it: the GPU in that APU is better than the 5450's, and the 5450 does nothing to enhance it.
While the 5450 comes in silent models and is good for the DAW/NAS/DB-host/VM-hypervisor purpose I had for it, it just costs power and slows your graphics down! Put that card in a host box.
AMD requires you to match APU and GPU fairly closely (or it used to) to take advantage of the power-saving features. It also goes without saying that the add-in GPU should be more powerful than the APU's own built-in GPU! When the GPU on the card and the GPU on the APU cooperate properly, the motherboard's own connectors should be used for the primary monitor(s) so that the card's ports can be shut off to save power, while only its GPU functions run (and that only when needed). Only in 3+ monitor configurations do you use the card's ports. So you should blow a little extra cash and do what http://www.overclock.net/t/1385808/can-i-crossfire-an-a10-5700-processor-with-a-radeon-hd-5450-graphics-card suggests. They note that "the highest [the A10-5700] can dual graphics with is the 6670, some say that the 7670 will be supported since its pretty much the same card, same goes for the 7750."
The A8-5600K, according to http://www.cpu-world.com/info/AMD/Recommended_graphics_cards_for_AMD_dual-graphics.html, has an HD 7560D in it and can Crossfire with an HD 6570 but NOT a 6670 - also with the 7570 (both DDR and GDDR5 models), 7670, 7650A and 7670A. It may be a challenge finding a silent version of those, but they exist. Actually your A8-5600K has a very wide range of compatible cards; it's just that the 5450 isn't one of them...
Crossfire with the APU has a great power advantage in that the (compatible) card can be off when its GPU cores aren't needed, and its ports may never be needed. If this were done properly, and if you kept your storage on the Ethernet, you could cut the power supply down to a quieter 300W unit, I think. I'd also ditch the optical drive and go for a smaller case.
There's even a way to get to 200-250W: a 256GB PCIe SSD can be had for under $100 if you wait around, and that's suitable for OS and apps if you keep your own data on the network, cloud or USB external storage (128GB sticks are now $20 each!). I'm not sure I want spinning metal in a workstation any more; it just adds vulnerabilities.
Not seeing the "media serving" side of this... that box has specs like a database server. If you want to serve up 4K media on several streams but make the box out of garbage from the dumpster, try boosting from this build https://pcpartpicker.com/user/Craig_Hubley/saved/CnBvK8 [in that one I didn't really use a 1GB silent video card; it's a 256MB ATI 2600 HD Pro... from the garbage]. My SSD is the Intel 600p, an M.2 for under $100 (but with a strict write limit, so don't use it for swap or temp, just OS/apps), and junk 'green' drives in a mirror config (because they're unreliable for anything else).
I wonder why hardware RAID? Software disk management in OpenZFS works great... hardware RAID is scary: you won't be able to find that card/ROM again if it blows, and good luck recovering your data. Better write down every cylinder-level detail of the RAID config if you want it to survive a card loss.
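To make the portability point concrete, here's roughly what the ZFS equivalent looks like (pool and device names are made up). The whole "RAID config" lives in metadata on the disks themselves, so any machine with ZFS can pick it up:

```shell
# create a two-way mirror: no controller card, no ROM, no cylinder bookkeeping
zpool create tank mirror /dev/sda /dev/sdb
zpool status tank        # the layout is self-describing, stored on the disks
# moving the disks to a replacement box:
zpool export tank        # on the old machine (if it still boots)
zpool import tank        # on the new one - that's the entire recovery procedure
```

Lose the motherboard, the HBA, the whole chassis... the pool still imports anywhere ZFS runs.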
I wonder also why OS X anything? Solaris and FreeBSD are free... and they are designed for this stuff. Apple is designed to suck money out of you, and nothing else. What works in BSD, they cripple in OS X. Apple doesn't even support ZFS or OpenZFS itself; they want to sell you on their half-baked HFS+.
The lack of networking oomph is surprising. This motherboard is dual-GbE, but I got three GbE ports into my machine to separate file serving from other uses (like firewalling) and would go to five GbE (to make the host a router as well) under TrueOS or Ubuntu Server or even ESXi (with, say, a $70 Intel 4-port card from usedservers.ca or the like) in preference to speeding up the CPU. Also, some cards take a lot of load off the CPU...
So while this nice i7 build would saturate 220MB/s with that processor and storage, is it worth that much money to do it? Two much cheaper FreeNAS boxes would do the same. Unsure... I'd like to know how much ESXi and virtualization slow it down compared with a bare-metal install. I almost did ESXi, but first I want to get the ZFS storage and NFS clients established and the network optimized...
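A hedged sketch of what that one-NIC-per-role separation might look like under Ubuntu Server's netplan (the interface names and subnets here are assumptions, not from any real config):

```yaml
# /etc/netplan/01-nics.yaml -- hypothetical addressing, one GbE port per role
network:
  version: 2
  ethernets:
    eno1:                          # file-serving subnet (NFS / torrent traffic)
      addresses: [10.0.1.1/24]
    eno2:                          # firewall / WAN-facing leg
      dhcp4: true
    eno3:                          # management and everything else
      addresses: [10.0.2.1/24]
```

Apply with `sudo netplan apply`; with a 4-port card you'd just keep adding stanzas, and turning the host into a router is then a forwarding/NAT rule away.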
The box mandate is confusing: "media" serving? What media needs this kind of firepower to serve? At 220MB/s (about 1760Mbps) you can serve up 90Mbps 4K streams to nearly 20 people at once... with the right network / TVs.
Remember, even a Seagate Central with its pitiful CPU and laughable excuse for an OS can serve up 1080p fine over a gigabit network. Smart TVs only have 100Mbps Ethernet because that's all they need (a 720p stream off a disc runs about 10Mbps; 2160p is 9x the pixels, so roughly 90Mbps... <100Mbps).
So for "media serving" purposes in, say, a hotel, a well-optimized network should have a managed (likely PoE-capable) 100Mbps switch, dedicating a port to each 4K TV, with room for VoIP phones, security cameras... Go to 1Gb PoE if you want; it's out there (US$8/port for 6-port passive 12V or 24V). Keep your dual-GbE devices separated on their own backbone/subnet. One big host like this i7 monster per hotel floor would do you for a while... but it seems overkill even for that need.
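The arithmetic above as a quick sanity check (the 220MB/s and 90Mbps figures are the ones claimed in this thread, not measurements):

```python
disk_MBps = 220                    # claimed sequential throughput of the i7 build
stream_Mbps = 90                   # ~9x a ~10 Mbps 720p stream, for a 2160p picture
total_Mbps = disk_MBps * 8         # bytes/s to bits/s
streams = total_Mbps // stream_Mbps
print(total_Mbps, streams)         # 1760 Mbps -> 19 concurrent 4K streams
```

So "nearly 20" streams off the disk alone, before the network even becomes the bottleneck.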
https://pcpartpicker.com/user/Craig_Hubley/saved/CnBvK8 is as close as I get to a ghetto build so far; about half the parts came from the garbage. The cardboard case really has me thinking: I suspect it would baffle noise far better than old metal cases... and if not... throw more cardboard around it...
For ghetto legit I should note that the first ten times I turned this build on, it was lying on the cardboard box its ASUS motherboard shipped in, and had I found a lower-profile video card and CPU cooler, I would have used that as the actual case. This could be a thing. Motherboards ship in extremely hefty cardboard. Although the static could be a problem...
I'd say it's a feature.
Intel 600p SSDs lock up and become read-only after a certain number of terabytes written. Planned obsolescence. If your cardboard PC is on so much that it burns up, that's like a sign from the fan gods to go get a real case. LOL
I bet you could make enough to buy several top-of-the-line PCs just by putting a webcam on this thing and taking bets on when it burns itself down.
My money's on "never" at 90W.
OK, there is a serious flaw in PCPartPicker, which is the inability to add parts not in the lists.
For instance, I want to do a NAS box build using the Corsair 380T case (it's there) and the Gigabyte 9SiSL mainboard (it's not there, but here it is: http://b2b.gigabyte.com/products/product-page.aspx?pid=4988#ov).
Please do not hand-wave this away as a "server" board: 4x gigabit LAN ports has every use under the sun, the PCIe x16 slot can handle whatever graphics you need, and 4x DIMM slots lets you re-use old RAM you couldn't fit in a 2x DIMM or SODIMM mini-ITX build, so there is lots of reason to get the server board. Also the Corsair case can hold 2x 3.5" drives, so you can RAID0 your storage if you want to go cowboy, RAID1 it if you want safety, or just put old drives in it counting on one not to fail (or an old and a new drive). Hauling this thing around to hook up to other people's 4K screens with a killer video card in it seems like a pretty good plan, smarter than "gaming laptops" which, face it, are just a joke. If they were any good, none of us would be here, right?
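If you go the software route on those two 3.5" bays, the safe RAID1 option is basically a one-liner under Linux md (device names assumed; whole disks used here for brevity, partitions are more typical):

```shell
# mirror the two drives; --level=0 instead gives the cowboy RAID0 stripe
sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda /dev/sdb
sudo mkfs.ext4 /dev/md0        # or hand /dev/md0 to whatever filesystem you like
cat /proc/mdstat               # watch the initial resync
```

Same portability argument as ZFS: the array metadata is on the disks, not on some card's ROM.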
At least a few such "server" boards are great general-purpose builds. Here's one with two 10GbE ports, two 1GbE ports, and a Xeon D-1521 that takes 2133MHz DDR4: http://b2b.gigabyte.com/products/product-page.aspx?pid=5761#ov
plus there's a PCIe slot... and who really needs more than one for graphics these days? No one... a single board drives 3 screens now, and face it, you're going to use 3x 1080p screens, not one 4K, right?
You want a zombie party? Get a few of these, hook one 10GbE port on each up to the next in a ring, then the other 10GbE port to the host game server, into which you toss this: http://b2b.gigabyte.com/products/product-page.aspx?pid=5251#ov
then sell all your external hard drives, QNAP, Seagate Centrals, Thunderbolt nonstandard crap... because you're going to boot FreeNAS on it when you're not gaming http://freenas.org http://grid.referata.com/wiki/NAS#open_but_dedicated_NAS_options
All that oughta shave off a few ms of pingtime...
Why is the PSU better?
I like the liquid cooling and the dead-silent box; cardboard really muffles sound too. I'm all for cardboard or non-metal casing, cuz it just doesn't make much noise if it's foam or corrugated...