Building My ULTIMATE, All-in-One, HomeLab Server

  • Published 27 May 2024
  • Today I built the ultimate, all-in-one HomeLab home server to handle everything.
    Sliger sent this case to me but asked for nothing in return.
    Other 4U Cases
    - Sliger CX4150a - www.sliger.com/products/rackm...
    - SilverStone RM44 4U - amzn.to/3K0wpmk
    - RackChoice 4U - amzn.to/3UB8bEf
    Other Parts
    - Samsung SSDs - amzn.to/3USTxtj
    - Corsair Airflow Case (newer) - amzn.to/44BV0HI
    - 10g Ethernet adapter - amzn.to/3wkN0hP
    - LSI HBA - amzn.to/3UWBuCN
    (Affiliate links may be included in this description. I may receive a small commission at no cost to you.)
    Video Notes: technotim.live/posts/ultimate...
    Support me on Patreon: / technotim
    Sponsor me on GitHub: github.com/sponsors/timothyst...
    Subscribe on Twitch: / technotim
    Become a YouTube member: / @technotim
    Merch Shop 🛍️: l.technotim.live/shop
    Gear Recommendations: l.technotim.live/gear
    Get Help in Our Discord Community: l.technotim.live/discord
    Tinkers channel: / @technotimtinkers
    00:00 - What I want out of a HomeLab Home Server
    01:19 - Selecting a case / chassis
    02:23 - Use Old case?
    02:58 - New or Reuse?
    03:33 - Other Case Options (Zack Morris style)
    03:51 - Thinking about Hacking this chassis
    04:19 - CPU & Motherboard
    05:39 - Disassembling
    06:48 - Component layout
    08:11 - How to get 15 SSDs in here
    08:57 - Maybe print some parts?
    09:45 - For now, it's jank
    10:24 - Test flight
    11:02 - Power usage
    11:37 - Testing components with an OS
    12:18 - Networking
    13:02 - Temperature checks
    13:30 - Testing GPU
    14:45 - SSDs are here
    15:19 - Racking Server
    15:56 - Weird Gap
    16:20 - Selecting the operating system
    Thank you for watching!
  • Science & Technology

Comments • 289

  • @TechnoTim
    @TechnoTim  18 days ago +73

    Sorry about the mistake by saying 5.25" drives! While researching and testing, I was trying to figure out how many drives I could fit in the Corsair's 5.25" bays and somehow that got into my script. 🤦‍♂ In the spirit of mixing things up, let me know what you've mixed up before!

    • @amateurwizard
      @amateurwizard 17 days ago +3

      Please try Unraid. It is very different in ways I'd like you to show people. I finished an ITX build on Wednesday and by now
      (Friday) I have an entire *arr stack with multiple instances of certain containers running, even while being pretty busy at work.

    • @janhebi
      @janhebi 17 days ago

      I was about to comment on that xD yeah, I've mixed stuff up too, I can't come up with anything rn though
      great video btw

    • @joshhaas8121
      @joshhaas8121 17 days ago

      Gave up on floppy disks long ago

    • @actng
      @actng 17 days ago

      Leaving mistakes in is a surefire way to drive engagement lol, people love to tell you when you're wrong hahaha

    • @aaronlindsey1942
      @aaronlindsey1942 14 days ago

      I hate typing "disk" in front of someone at work and accidentally typing "dick"

  • @corrpendragon
    @corrpendragon 18 days ago +105

    I, also, hate using 5.25" hard drives. Such a pain ;)

    • @yuan.pingchen3056
      @yuan.pingchen3056 18 days ago +12

      I know, the Quantum Bigfoot, it's a nightmare, it doesn't even have UltraDMA mode.....

    • @TechnoTim
      @TechnoTim  18 days ago +8

      I had 5.25" on my mind because I was trying to see how many drives I could fit in the Corsair case's 5.25" bays when writing this 🙃

    • @corrpendragon
      @corrpendragon 18 days ago +4

      @@TechnoTim how many can you fit?

    • @williamp6800
      @williamp6800 16 days ago +1

      Better than using 8” floppies

    • @MorganTN
      @MorganTN 1 day ago

      @@yuan.pingchen3056 LOL, I remember those, I only had one in my time... I also remember MFM/Winchester drives from the AT/XT days.

  • @YakDuck
    @YakDuck 18 days ago +48

    Hey Tim, I'm hard of hearing, but I just want to say thank you for your time and effort in adding subtitles 😊

    • @TechnoTim
      @TechnoTim  18 days ago +13

      No problem! I try my best everywhere, even on websites with A11Y!

  • @techaddressed
    @techaddressed 18 days ago +34

    Your public library might have a 3D printer if you don't want to purchase one. I use my library's printer often.

  • @hakunamatata324
    @hakunamatata324 17 days ago +4

    4:40: x16 will only run as x8, and x8 runs as x4, if you are using a feature that shares the same lanes (an x16 slot and an NVMe Gen4 slot, for example).
    It's important to know your motherboard's limitations, like how many PCIe lanes you have and which features share the same lanes.
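The lane-sharing arithmetic above comes down to a simple budget check. A minimal sketch; the device names and lane widths are illustrative assumptions, not taken from any specific board manual:

```python
# A toy lane budget for a mainstream desktop CPU. Always verify against the
# motherboard's block diagram, which shows which slots actually share lanes.
CPU_LANES = 20  # assumed usable CPU lanes

devices = {
    "GPU in x16 slot (trains down to x8 when shared)": 8,
    "NVMe Gen4 SSD": 4,
    "10G NIC": 4,
    "HBA": 4,
}

used = sum(devices.values())
print(f"{used}/{CPU_LANES} CPU lanes allocated")
assert used <= CPU_LANES, "over budget: a slot will train down or be disabled"
```

If the total exceeds the budget, the board doesn't fail — slots simply negotiate down (x16 to x8) or get disabled, which is exactly the behavior the comment describes.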

  • @Krushx0
    @Krushx0 16 days ago +6

    For an ultimate all-in-one homelab server, a hypervisor without even thinking. One solution (nearly) fits all.

  • @evertgbakker
    @evertgbakker 18 days ago +16

    200 watts idle. Where I live (the Netherlands) that's about €500/year.

    • @subukai
      @subukai 18 days ago +5

      Ouch. The typical USA kWh price is $0.15, which is about $262 a year for a constant 200 watts.

    • @asishreddy7729
      @asishreddy7729 13 days ago

      You can thank the failed sanctions on Russia for that.
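The cost figures in this thread can be reproduced with a short sketch. The €/kWh rate below is an assumption chosen to land near the quoted ~€500/year; check your own tariff:

```python
def annual_energy_cost(idle_watts, price_per_kwh):
    """Yearly cost of a box drawing `idle_watts` around the clock."""
    kwh_per_year = idle_watts * 24 * 365 / 1000  # watts -> kWh per year
    return kwh_per_year * price_per_kwh

# 200 W idle at the commenter's $0.15/kWh US rate
print(round(annual_energy_cost(200, 0.15), 2))   # ~262.8 USD/year
# An assumed Dutch rate of ~0.285 EUR/kWh gives roughly the quoted 500 EUR/year
print(round(annual_energy_cost(200, 0.285), 2))
```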

  • @jeremymigonis2498
    @jeremymigonis2498 18 days ago +107

    5 1/4 DRIVES?? They're 3.5 inch!

    • @TechnoTim
      @TechnoTim  18 days ago +50

      Oof! What the heck was I thinking when I wrote this? I think this snuck into my brain because I was playing around with my old Corsair case and was trying to figure out how many drives I could fit in the 5.25" drive bays 🤦‍♂

    • @corrpendragon
      @corrpendragon 18 days ago +7

      ​@TechnoTim we've all been there, lol!

    • @ickyendeavors4179
      @ickyendeavors4179 18 days ago +3

      @@TechnoTim I think I still have some old SCSI or maybe RLL 5.25" HDDs. Just in case you need one that has a sum total of 32 MB (megabytes).

    • @TylerTroglen
      @TylerTroglen 18 days ago +4

      He was using Quantum Bigfoot drives ;)

    • @BrentUpton1
      @BrentUpton1 18 days ago

      Beat me to it!

  • @ASFokkema
    @ASFokkema 17 days ago +6

    Don't go with the 870 EVOs; I made the same mistake (they are consumer drives and wear out quickly!). I replaced all of them with Samsung SM883s (ZFS pool).

  • @questionablecommands9423
    @questionablecommands9423 11 days ago +1

    15:18 When racking servers by myself, I've found that there are usually holes in both the sliding part of the rail and the stationary part (the portion that attaches to the rack itself). Every rail is different, so it always takes some experimentation, but I've found that I can put a spare screw/toothpick/pointy thing through both holes so the sliding part of the rail doesn't push back while I get things lined up. I do this on both sides, sticking out different amounts, so I can line things up one side at a time. Just be sure that the holes you pick in the rail can be reached from the front of the rack.

  • @blinkitogaming
    @blinkitogaming 18 days ago +3

    For years I've run an unRaid server with a W10 VM with GPU and NVMe passthrough that I used for playing games; the rest of the system was used for Docker stuff: Plex, the *arr suite, Home Assistant and a large etc.
    Just make sure you have a Renesas-chip-based USB PCIe card passed to the VM so you can plug and unplug peripherals without freezing the VM.

  • @RyanMcGuinness
    @RyanMcGuinness 1 day ago +2

    Probably too late to the conversation, but I recommend sticking with Proxmox and using SR-IOV to pass through part of the GPU to multiple machines.

  • @redhonu
    @redhonu 18 days ago +11

    I went bear metal on my homelab server for a while, because it could do everything I wanted. However, things changed and now I've reinstalled everything on Proxmox. The overhead is low and you have the flexibility to change anything in the future. So I would just install a hypervisor of your choosing.

    • @Marsh.x
      @Marsh.x 3 days ago +1

      Grizzly, Black or Brown ?

    • @redhonu
      @redhonu 3 days ago +1

      @@Marsh.x Sorry i didn't clarify, black bear metal of course.

  • @dieseldrax
    @dieseldrax 18 days ago +6

    Also, thanks for the info on the Sliger cases. You just made me spend more money, they look great and are made in the USA for a reasonable price. My Threadripper platform is getting a new home. :)

  • @haxwithaxe
    @haxwithaxe 18 days ago +5

    Proxmox or xcpng are what I'd go with. It's nice to not have projects competing for ports or service configs.

  • @Docmeir
    @Docmeir 18 days ago +2

    I moved to a single giant server build a while back from my own giant rack with Dell PowerEdge servers, but it was too much power usage. For most use cases in a single-server build, I found Unraid to be the best base OS for me. Five years later, it's still rock solid and I've never had any major issues. It's kind of on autopilot and it just works.

    • @CampRusso
      @CampRusso 17 days ago +1

      Been running unRAID for 🤔 3+ years... I've lost count. 😆 The UI makes it super easy to manage everything.

  • @marcogenovesi8570
    @marcogenovesi8570 18 days ago +23

    Try out Unraid. It's IMHO the best OS for a single system: file shares, VMs, passthrough, containers and ZFS support.

  • @XTJ7
    @XTJ7 13 days ago +1

    "I want a machine that does everything" 1 minute later "I already have a NAS" :P
    To say something more productive: I prefer buying used enterprise drives because their TBW rating is vastly higher. Even compared to high end consumer SSDs it is often between 3 to 9 times higher. And it gets so much worse if you compare against QLC drives. Considering things like write amplification and RAID you can run through a lot more written data than you expect. That is probably a trade-off you made intentionally, considering you have a separate NAS that can serve as a backup target, but it is something to keep in mind for others attempting to replicate it.
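The endurance trade-off described above is easy to quantify. A sketch with illustrative TBW ratings (assumptions, not datasheet quotes) and an assumed write-amplification factor:

```python
def years_until_tbw(tbw_tb, gb_written_per_day, write_amplification=2.0):
    """Years until the drive's rated TBW is exhausted at a steady write load.

    write_amplification is an assumed multiplier covering RAID/ZFS overhead
    and internal SSD write amplification -- tune it to your workload.
    """
    tb_per_day = gb_written_per_day * write_amplification / 1000
    return tbw_tb / tb_per_day / 365

# Ballpark figures: ~600 TBW for a 1 TB consumer SATA SSD class vs.
# several thousand TBW for a used enterprise drive.
print(round(years_until_tbw(600, 200), 1))    # consumer-class drive
print(round(years_until_tbw(5000, 200), 1))   # enterprise-class drive
```

At the same 200 GB/day load, the roughly 8x higher rating translates directly into roughly 8x the service life, which is the commenter's point about used enterprise drives.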

  • @CampRusso
    @CampRusso 17 days ago

    Same here. 😁👍
    Taking a play from my corporate sysadmin world: separate storage and compute boxes.
    Going to build a TrueNAS Scale box as the central storage for the entire homelab, then use the 2nd unRAID license to build a fresh compute server. Both will have 10GbE until I can swap for fiber.

  • @minipuft
    @minipuft 7 days ago

    Getting a long PCIe riser cable and making full use of the CPU you have seems like the best move IMO.

  • @NielsenPhotos
    @NielsenPhotos 14 days ago

    Hey Tim, great build, can't wait to see how it works out. Always looking for projects like this. I ended up with the iStarUSA D-410-DE36 case that allows for 36 drives a while back for the hot-swap trays. Paired it with a NORCO RPC-4224 4U which uses 24 3.5in drives. The purpose was to run a flash NAS and back up to a spinning-disk NAS. If you end up making 3D-printed drive holders for this project, I would love to see that adventure.

  • @DPCTechnology
    @DPCTechnology 18 days ago

    Awesome stuff, thanks for sharing!

  • @Justfun-nk3vj
    @Justfun-nk3vj 17 days ago +2

    With 'only' PCIe Gen 3.0, you can hardly call it the 'Ultimate' HomeLab server of 2024.

  • @robertboskind
    @robertboskind 18 days ago +2

    I don't think this is the ideal project to try Unraid on, but you really need to try it if you haven't. I always have at least a couple of servers in my rack and love changing things out, but the Unraid box is always there

  • @nonamesi
    @nonamesi 14 days ago +1

    @5:04 I don't know if CPU lane calculation is as simple as you explain. You should probably check the motherboard info for how many lanes are used for internals like LAN, USB, ...

  • @sachasmart7139
    @sachasmart7139 14 days ago

    Yes! Thanks for the change in content

  • @NilsRatusznik
    @NilsRatusznik 18 days ago +1

    I can't remember landing the plane in Top Gun on the NES. Well done! 🎉 As for the OS, I would vote for Proxmox or a regular distribution like Debian, Ubuntu or a RHEL clone. You could use Cockpit if you want to spin up VMs on them.

  • @atomycal
    @atomycal 18 days ago +3

    Proxmox for the win!
    No reason why, I just love it.

  • @Sossingro
    @Sossingro 17 days ago

    I rackmounted my PC a week ago. The chassis was less than (the equivalent of) £100, including rails, and fits my 3 chunky radiators in too. My office is so much cooler and incredibly quiet now.

  • @NigelDev
    @NigelDev 18 days ago +1

    So apparently we are on the same tech wavelength and I just now noticed it, lol. I too just built a few servers to do all the things. I ended up going with an Epyc system for all 128 of its PCIe lanes, dumped into an HL15. Another machine made use of the SilverStone RM41-506 4U chassis. I needed the 5.25" bays for the tape drive and an Icy Dock 4x 2.5" SSD hot-swap cage, but the GPU just fits :)

    • @DrDipsh1t
      @DrDipsh1t 18 days ago

      I've been eyeing the dual Epyc setups on eBay to eventually migrate everything to. I have absolutely no need for all 128 lanes or cores/threads, but it'll be all I'd ever need and then some.

    • @NigelDev
      @NigelDev 18 days ago

      @@DrDipsh1t Wow, yea, dual Epyc would be Epic 😂 I am actually making pretty good use of the PCIe lanes on my 7282 16c/32t rig: an x16 PCIe card that holds 4x NVMes, a 16-port HBA, a Radian RMS-200 edge card and a dual-port 10gig SFP+ NIC. I still have 2 more NVMe M.2 slots, 2 OCuLink ports and 2 mini-SAS connectors on the motherboard I could populate. It's an EPYCD8 board from ASRock Rack.

  • @BillyBurtonTech
    @BillyBurtonTech 14 days ago

    When you were talking about the power supply with those giant connectors you've never seen before, you really proved to me how old I actually am, so thanks for that. LOL. Looking forward to seeing how this turns out.

  • @jonjohnson2844
    @jonjohnson2844 18 days ago +1

    1 upvote for Unraid - just get in before they get rid of the perpetual licenses (if they haven't already), it has such a good community and (for me) has been the best solution for basically just running a zillion Docker containers.

    • @CampRusso
      @CampRusso 17 days ago +1

      I bought a 2nd license right after I saw that announcement. 😆

  • @ronm6585
    @ronm6585 18 days ago +1

    Thanks Tim.

  • @Der089User
    @Der089User 18 days ago +5

    Built a homelab server as well - ran Unraid for a while, which was in the end a patchwork of tools for functions that a server system should already have integrated. And running from a USB stick is not something I want to rely on.
    So I decided to run Proxmox, which is the most flexible system in my eyes. It's lightweight, can run VMs or LXCs, can pass through hardware, it's reliable and definitely the more professional choice.

    • @Hansen999
      @Hansen999 15 days ago

      It's basically only the config files stored on the USB stick.
      Unraid will load its config into RAM and run from there.
      For reference my Unraid system has 19,877 reads and 7341 writes to the USB stick and it's been in operation for 2 years with a lot of changes.
      Should my USB stick fail, I can just download the automated backup to a new USB stick, recover my license and I'm up and running again.

    • @Der089User
      @Der089User 15 days ago

      @@Hansen999 Thanks for the information! I know the story. Used it for three years.

  • @MrBcole8888
    @MrBcole8888 18 days ago +12

    Switch to a fiber SFP+ transceiver and it will run much cooler. The RJ45 copper ones run really hot and use a lot of power.

    • @hassell7530
      @hassell7530 17 days ago +1

      DACs are a nice alternative as well.

    • @CampRusso
      @CampRusso 17 days ago

      I have a dual 10 gig Ethernet card in my unRAID and noticed firsthand how toasty it is. Now I wish I didn't give away the SFP card 🤦🏻‍♂️

  • @thatdudeinorange6786
    @thatdudeinorange6786 18 days ago +3

    Try Unraid! I love it and it is very user friendly.

  • @peteradshead2383
    @peteradshead2383 18 days ago +2

    With the price of electricity I like to keep the 24-hour-a-day wattage to a minimum, about 50-60 W including an RTX 2060 Super.
    I have a Ryzen 5700G with 4x 4 TB SSDs and 64 GB RAM, but no ECC memory and only 20 lanes.

    • @pesfreak18
      @pesfreak18 18 days ago +1

      I plan to build a server with the 5700G and 64 GB RAM as well. Do you have any numbers on the power usage without the video card? I guessed it would be around 25-30 W, but numbers are hard to find online (at least for me). Would appreciate it if you could help :)

    • @peteradshead2383
      @peteradshead2383 18 days ago +1

      @@pesfreak18 Switch all the power boost modes off. With 2x 4 TB SSDs and 2x 2 TB NVMe drives, a small NAS on a DeskMini X300 set up as just a Samba server idles at 15 watts according to my power monitor plug.
      But my other system on a B550 motherboard, with the GPU and about 10% CPU load all the time, is about 50-60 watts. I think I must be the only person running a 5700G water cooled with a 360mm cooler, but that came from removing my X570 motherboard with a 3950X because I could not get it below 120 watts at idle.
      16 cores / 32 threads was a little overkill for what I wanted my home server to do; in fact, I think the 5700G is overkill.

    • @pesfreak18
      @pesfreak18 18 days ago +1

      @@peteradshead2383 15 watts seems very efficient. Water cooling is a little bit overkill for me too, but it sounds like a cool project. The Ryzen is maybe overkill, but I want to experiment with game servers and maybe AI in the future. I think I can utilize the CPU well enough with these tasks.

  • @cloufish7790
    @cloufish7790 17 days ago

    15:25 - Interstellar - Docking Scene
    Llama3: Endurance rotation is 37.64 RPM. It's not possible.
    Techno Tim: No. It's necessary.

  • @flahiker
    @flahiker 18 days ago

    Interesting. I am literally building an AMD Threadripper 7960X using a Sliger 4170i case with the ASetek 836SA AIO cooler. Just waiting on the case to be delivered to start the build.

  • @Blaq_Out
    @Blaq_Out 15 days ago

    Love my Sliger cases. Have the CX4712 for my NAS and the CX4150a for my desktop. I will probably buy a CX4200a for a GPU upgrade though. A 3090 FTW BARELY fits with a low-profile radiator; it just fits a credit card between them.

  • @nomercyriding
    @nomercyriding 17 days ago

    I had the same idea a few months ago (all encompassing build with a lot of PCI), and I ended up going with a barebones Dell Precision T5820. I threw in an Intel Xeon W-2140B and some ECC RAM. Don't sleep on repurposing used workstation hardware!

  • @TazzSmk
    @TazzSmk 18 days ago +1

    I'm contemplating a similar build, and here are my observations:
    for proper local AI self-hosting, having a Mac Studio with 64GB unified memory is MUCH more suitable than an RTX 3090, which is limited to "only" 24GB VRAM;
    that said, Nvidia GPUs utilizing CUDA are often faster than Apple's Neural Engine;
    all reasonably priced Xeons are limited to PCIe 3.0, which is not futureproof at all and already a bit of a bottleneck for current GPUs;
    any strong GPU takes a lot of space and covers most PCIe slots on the motherboard, so it's rather difficult to decide on a PCIe expansion layout; preferably the GPU goes in the "lowest" x16 slot, which then needs a bigger-than-ATX case;
    that said, some local AI tools can utilize multiple GPUs or even multiple computers, so there's a delicate (cost/efficiency) balance between running, say, 4 GPUs in one rig or 4 computers connected together via 100GbE;
    the latest Windows Server, or plain Windows Pro with WSL (Windows Subsystem for Linux), seems the best OS for the widest range of local AI applications; arguably more manageable via Proxmox, but with undesired performance loss;
    to sum it up, one all-in-one server doesn't seem that efficient, depending on what you run on it.

  • @davemeech
    @davemeech 17 days ago

    This describes EXACTLY what I want to do (in the intro anyways). Perfect workout watch.

  • @ExpressITTechTips
    @ExpressITTechTips 18 days ago

    I'm tempted to Threadripper my next all-in-one homelab build, but this is a good and cheaper alternative, I feel.

  • @computersales
    @computersales 18 days ago

    Sliger offers water-cooling options for the Threadripper/Epyc line of CPUs in that chassis. I'm not aware of any constraints that would stop you from using random AIOs as long as they aren't too thick for your GPU clearances. There is a build someone did in that case with a thick 360mm cooler and a 3090 Ti.

  • @djplasma02
    @djplasma02 15 days ago +1

    Tim, great video! Have you considered exploring Proxmox on this server and demonstrating GPU passthrough? It would be valuable to see which remote software is optimal for accessing VMs, and perhaps even conduct a gaming test. Looking forward to your future content! Greetings from Bosnia :)

  • @LackofFaithify
    @LackofFaithify 17 days ago

    Xeon. ZFS. Consumer SSDs (granted, they are dirty RZATs). 10-gig Ethernet. AST2500. Anything but the RTX 4000 SFF Ada. I see great moments of joy and power efficiency in your future.

  • @techpchouse
    @techpchouse 17 days ago

    Unraid is definitely a good option to test. A lot of great features, and the team behind it works really hard to push wishes from the community.

  • @ickyendeavors4179
    @ickyendeavors4179 18 days ago +1

    I'm kind of surprised you're waiting for Intel 15th Gen. Right now, that seems to be a really losing strategy based on their power problems and lane management. While you can say "only a few more lanes" on the Ryzen, those lanes can have real impact on what's available to you if you are looking for onboard 10Gb plus multiple PCIe x16. For most users, you might find a Threadripper Pro 3000, 5000, or now 7000 series available (new or used), and you can end up with a WRX80 or WRX90 board and get 128 PCIe lanes, which will cover absolutely anything you've ever thought about. I would put that into high contention on your list.

  • @neccros007
    @neccros007 18 days ago +1

    5.25" hard drives?? Sign me up!!

  • @inflatablemicrowave8187
    @inflatablemicrowave8187 16 days ago +1

    Strap a fan to that HBA. It needs airflow. Look up the CFM it needs and you'll see why, or touch the heatsink after running under load for a while.

    • @BoraHorzaGobuchul
      @BoraHorzaGobuchul 6 hours ago

      Perhaps not the best idea to touch it, particularly when it's installed like that, where it's bound to get pretty hot.

  • @ewenchan1239
    @ewenchan1239 18 days ago

    If you didn't have the hardware already, I would've recommended a Threadripper system because it offers more PCIe lanes.
    (That's the direction I'm heading, except that I'll likely end up with something like an 8U chassis and then use PCIe risers/extensions so the GPU won't cover the rest of the slots. My other option is a Supermicro 4U or 5U GPU server, but those are limited to dual-slot-wide GPUs only, which means that my 3090s will be blocking some of the other slots.)

  • @purgalimited
    @purgalimited 14 days ago

    Kewl setup; especially the delivery of SSDs is funny 😂 they weren't stolen

  • @fwiler
    @fwiler 18 days ago +2

    So tired of them artificially holding back PCIe lanes and also holding back bifurcation of PCIe lanes on consumer CPUs and motherboards.
    For the 2.5" drives: print a bar that goes across the top of the drives, long enough to touch both sides of the case, with small ridges printed into the bar at each drive location to keep them in place. That way you don't block airflow, and you can easily lift it off if you need access to the drives.

  • @IconicDavexD
    @IconicDavexD 17 days ago +1

    I would go with Unraid for a build like this as it's easy to use and has a lot of features, and if you really want to tinker, it runs on Linux and has a terminal for any custom tinkering :)

  • @aaron57422
    @aaron57422 18 days ago

    If you do find a way to convert those 3.5" bays to 5.25", Wendell has shown some interesting enclosures that adapt a 5.25" bay to 2.5" or NVMe flash backplanes.

  • @TheAmericanMuffin
    @TheAmericanMuffin 13 days ago

    Question: is that power supply able to handle the power spikes 3090s are infamous for? Genuine question.

  • @Fiftykilowatt
    @Fiftykilowatt 17 days ago

    Custom water cooling would give you back some more lanes. But I am maybe a bit too much in love with WC 😂

  • @oscarcharliezulu
    @oscarcharliezulu 17 days ago

    Ooh, look at that Sliger case! Very pretty! PCIe lanes are why I'm still using X299 systems :/

  • @Sebahk
    @Sebahk 18 days ago

    I'm excited for the software part! What would you say are the disadvantages of proxmox in a build like this? Doesn't that give you more flexibility?

  • @levifig
    @levifig 18 days ago +3

    100% add this as a worker node to your K8s cluster! No need to have a different management layer!
    K8S ALL THE THINGS!!! 🤘

  • @owNewBlood
    @owNewBlood 15 days ago +1

    You mention in the script that the AMD has more PCI Express lanes, but in the video you showed 20 vs 16 and then said "same number". Actually, the Ryzen series has 28 native PCIe lanes total, 24 usable.

  • @MrBrutalmetalhead
    @MrBrutalmetalhead 18 days ago +1

    Love the video. Just set up some local AI following NetworkChuck's video; it works awesome. Can't wait for your take.

  • @ClayBellBrews
    @ClayBellBrews 12 days ago

    There are lots of makerspaces that have 3D printers. Some public libraries as well.

  • @trexgamer73
    @trexgamer73 6 days ago

    Nice video!

  • @howardleen4182
    @howardleen4182 18 days ago +2

    Proxmox or Ubuntu please!

  • @blitzio
    @blitzio 8 days ago +1

    Would love to see you try Unraid out!

  • @jefff7316
    @jefff7316 18 days ago +1

    OMG, the Top Gun carrier, lmao. I never once landed it as a kid.

  • @blakestandal8294
    @blakestandal8294 5 days ago

    I'm about to rebuild my server and I'm actually really curious what OS you're gonna run, because it may influence my decision. I really didn't enjoy TrueNAS because setting up apps wasn't intuitive at all. Curious about Unraid but not thrilled about the price. Proxmox seems logical, but I wanna be able to add more storage later on (ideally) into the same pool. Cool build! Will be looking forward to part 2!

  • @bufanda
    @bufanda 18 days ago

    If you look at the schematic again, it says right there which slots are which. For example, the first x16 slot says CPU SLOT6 PCI-E 3.0 x8 (in x16), so that is an x8 in an x16 connector which is connected to the CPU, and the last x8 says PCH SLOT1 PCI-E 3.0 x4 (in x8), so it's PCIe via the PCH and only has 4 lanes. So in any case the GPU will only run on 8 lanes in either of the x16 connector slots, as both are only x8 (in x16).

    • @TechnoTim
      @TechnoTim  18 days ago

      Thank you! Yes, I actually edited out the part where I explained that, I probably should have left it! When I did the math you can see I adjusted for it (8 x 6 = 48, and the last was 4 in an 8)
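The lane math in this exchange can be written out explicitly (slot labels follow the comment above, not a verified manual):

```python
# Six CPU-attached slots electrically wired x8 (some in x16-length connectors)
# plus one x4 slot hung off the PCH in an x8-length connector.
cpu_slot_lanes = [8] * 6   # "CPU SLOT6 PCI-E 3.0 x8 (in x16)" and friends
pch_slot_lanes = [4]       # "PCH SLOT1 PCI-E 3.0 x4 (in x8)"

cpu_total = sum(cpu_slot_lanes)
print(cpu_total)                          # 8 x 6 = 48 lanes from the CPU
print(cpu_total + sum(pch_slot_lanes))    # plus 4 routed via the PCH
```

The distinction matters because PCH lanes share the chipset's uplink to the CPU, so a GPU should always go in one of the CPU-attached slots even though it will still train at x8 there.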

  • @csgrullon
    @csgrullon 17 days ago

    There are companies that offer 3D printing services; if you are not going to be printing stuff regularly, it could be a good idea to look into one of those services.
    Great video, love your content.

  • @nicholaushilliard6811
    @nicholaushilliard6811 23 hours ago

    Tim, ever consider the RTX 4000 Ada? It has 20GB of VRAM and AV1 encoding, all in an SFF single slot at 75W power consumption. Should be boss enough to play most games.

  • @Nairbener
    @Nairbener 17 days ago

    I would go bare metal and install Cockpit and cockpit-machines for the VMs, with LXC images.

  • @james-cucumber
    @james-cucumber 18 days ago +2

    Tiny subtitling correction at 2:05. I’m pretty sure you just left a gap in speech, rather than starting a new sentence.
    Generally though, your subtitles are very good. Thank you for taking the time to do them.

  • @darthkielbasa
    @darthkielbasa 18 days ago

    16:55 do it! Unleash that hardware, sir!

  • @DMBrownlee
    @DMBrownlee 13 days ago

    You mentioned this will not be taking over your NAS role. What about your firewall? I've seen some folks virtualize pfSense or vyos, but I don't think I would be comfortable hosting a firewall/ids/vpn on the same machine in case the virtualization solution has a security issue.

  • @dieseldrax
    @dieseldrax 18 days ago

    I've flip-flopped between Proxmox and Unraid for some years now. Started with Proxmox, went with Unraid on the next build, went back to Proxmox on the most recent build. Both have their pros and cons, but for my use case Proxmox made more sense. Unraid is pretty similar to TrueNAS Scale, however my experience with TrueNAS Scale hasn't been all that great.
    For example, Unraid supports Docker containers natively, Proxmox uses LXC. So, if you want to run Docker containers on Proxmox you have to spin up a VM or LXC to run Docker as I'm sure you know. Unraid Docker containers primarily come from the "Community Apps" catalog, however there is a beta app that installs Docker Compose to make spinning up custom containers easier. Both Proxmox and Unraid support ZFS, however Unraid is designed specifically to manage individual drives itself (Like TrueNAS) whereas Proxmox is more storage-agnostic.
    If you haven't used Unraid before then I would recommend taking it for a test drive, it might be just what you're looking for or the solution that you didn't know that you needed. No single platform can do everything and do it well, but based on your build I'd say Unraid would definitely be a good option to test. One problem I did have with Unraid, and maybe it was user error, is that I could never get Windows 11 VMs to work, even when enabling TPM, etc. It wasn't a blocker as I didn't NEED to run Windows 11, however with Windows 10 EOL looming I did want to make sure I had a path forward, so when I built my latest server I started with Proxmox and the first thing I did was install a Windows 11 VM. Worked perfectly. So for now I'm back using Proxmox.
    I'm not sure what I'm going to do with my old server, I'd like to put it into a rackmount case and mount it under the new server but as for an actual use for it...the only thing that comes to mind is turning it into a dedicated storage/backup system. On the one hand that seems like a waste of a Threadripper 1950x, on the other hand not using it is also a waste. The other option would be to sell it and build something a bit more recent on a more common platform. Decisions, decisions.

    • @CampRusso
      @CampRusso 17 days ago

      Been running a Win11 VM in the current version of unRAID. Works well for the simple tasks I need it for. 😁👍 Now you have to try it again. 😉🤣

  • @adreanvianna9569
    @adreanvianna9569 18 days ago

    Yeah, try Incus! Changed my homelab life up! I run it on Debian.

  • @Byrthor
    @Byrthor 18 days ago

    Thanks, Tim. I thought I was over getting traumatized by the carrier landing in Top Gun lol

  • @jumpmaster5279
    @jumpmaster5279 18 days ago

    Can I please have the STL files for the SSD cage you showed at 9:07, or at least guide me to the closest model for personal use?

  • @vjo03
    @vjo03 16 days ago

    "I don't want anything to be flopping around"... 15 minutes later: everything is flopping around

  • @WhyDoesNothingWork
    @WhyDoesNothingWork 18 days ago +2

    Creating a dedicated VM with the GPU is kind of limiting IMO. I find that using the GPU in LXC containers is the best way; you can have multiple containers use it at the same time and even monitor its usage from the host OS.

    • @BoraHorzaGobuchul
      @BoraHorzaGobuchul 6 hours ago

      Or split the card, but that requires finesse and doesn't work with all cards

    • @WhyDoesNothingWork
      @WhyDoesNothingWork Před 2 hodinami

      @@BoraHorzaGobuchul very few cards support that, and generally they're expensive and require licenses

  • @BLiNKx86
    @BLiNKx86 Před 18 dny +2

    Lol that saved-by-the-bell timeout

    • @1beerbaron
      @1beerbaron Před 18 dny

      Same, though it took me a few seconds to realize why it looked so familiar.

    • @TechnoTim
      @TechnoTim  Před 18 dny

      Thanks! Yeah, a late edit. I actually deleted the background and then added it back. No regrets!

  • @erickgruis106
    @erickgruis106 Před 18 dny +1

    I'm curious how your SSD pool works out. I have a TrueNAS Scale setup with 6 2TB Samsung EVOs and I've struggled to get decent performance. Literally it's less than a single disk. I tried striped 2 x (3 raidz1), 3 mirrored vdevs, and raidz1 across 5 of them.
    Hopefully you can do a follow-up video on your SSD pool setup and final performance numbers.
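
    [Editor's note: of the layouts the comment lists, striped mirrors generally give the best random I/O on SATA SSDs, since each vdev only needs one disk's latency per read. A sketch of that layout — device names are hypothetical placeholders, and `zpool create` is destructive, so verify devices first:

    ```
    # Three striped 2-way mirrors from six SSDs (sda..sdf are placeholders).
    # ashift=12 aligns writes to 4K sectors, which most SSDs expect.
    zpool create -o ashift=12 tank \
      mirror /dev/sda /dev/sdb \
      mirror /dev/sdc /dev/sdd \
      mirror /dev/sde /dev/sdf

    # Watch per-vdev throughput while benchmarking, at 5-second intervals:
    zpool iostat -v tank 5
    ```

    If performance is still below a single disk, the bottleneck is often the consumer drives' SLC cache filling under sustained writes (as the reply below notes), not the pool geometry.]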

    • @clintbishop9145
      @clintbishop9145 Před 16 dny

      Consumer models' performance will drop, I believe, because of the drive's cache; PRO models address this, and enterprise SSDs will be far superior. At the end of the day, you get what you pay for...

  • @kretzooo
    @kretzooo Před 17 dny

    Hi. I would like to know your opinion on how long the SSDs would last compared to regular HDDs on a more intensive writing/reading application. Let's say rtorrent with 2-3TB of data. I have a dedicated HDD just for this purpose that is passed directly to the VM and I am reluctant to change it for an SSD.

    • @TechnoTim
      @TechnoTim  Před 16 dny

      Thanks, the way I look at it is that all drives have a lifespan; some fail faster than others, and others outlive their lifespan. Since I look at all drives through the same lens (whether SSDs or spinners), I just make sure that they have a long warranty and keep one spare just in case. These Samsung drives have a 5-year warranty, and out of the 30+ EVO/Pro Samsung drives I have bought over the years, only 2 failed and Samsung replaced them within a week. For this reason, I typically go with consumer grade.

  • @deamit6225
    @deamit6225 Před 15 dny

    So why are you preferring 48 PCIe 3.0 lanes over 20-24 PCIe 5.0 lanes?

  • @UNcommonSenseAUS
    @UNcommonSenseAUS Před 18 dny

    What software are you using at 13:49 to manage all your VMs?

  • @-Good4Y0u
    @-Good4Y0u Před 17 dny

    This is the road I've been going down

  • @stevedoescomputerstuff

    Supermicro manuals will tell you how the PCIe slots are wired. Even if you don't use a slot that's, say, wired for x4 but sits in an x8 physical slot, you don't magically move those lanes somewhere else; they are stuck there. I'm willing to bet two of those x8 slots are actually x4. I haven't looked at the manual for that board, but that's most likely.
    Never mind, I just saw in the diagram that the last slot is wired x4, and those x16s are just x8. So your GPU will most likely get a small bottleneck.
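
    [Editor's note: you don't have to trust the manual — on a running Linux system the negotiated link width is reported per device. A quick check, assuming root and a hypothetical GPU at PCI address 01:00.0:

    ```
    # LnkCap = what the device/slot supports; LnkSta = what was actually negotiated
    sudo lspci -vv -s 01:00.0 | grep -E 'LnkCap|LnkSta'
    ```

    A GPU in a slot wired x8 will show `Width x8` in `LnkSta` even though the physical connector is x16. Note many GPUs also drop to a lower link speed at idle, so check under load.]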

  • @jjarechiga
    @jjarechiga Před 18 dny

    Do a Harvester install: open-source, Kubernetes-based IaC for containers and VMs with immutable nodes.

  • @Feriman
    @Feriman Před 15 dny +1

    What about your server in the DC?

  • @thomhodgson2721
    @thomhodgson2721 Před dnem

    For most people, a used Lenovo P920 is the ticket.

  • @JeffGeerling
    @JeffGeerling Před 16 dny

    I'm just now noticing the shirt. I like it, haha!

  • @devYT92
    @devYT92 Před 17 dny

    @technotim what cpu cooler is that?

  • @darrenoleary5952
    @darrenoleary5952 Před 18 dny

    TT's gone back to the 90's... 5.25" drives

  • @levifig
    @levifig Před 18 dny

    The last PCIe slot on that motherboard is served by the PCH and it’s a 4x, so it doesn’t “count” for the CPU lanes total. ;)

  • @sutekhxaos
    @sutekhxaos Před 15 dny

    As an avid Unraid fanboy, don't put Unraid on it lol. IMO Proxmox is the way to go. Can't beat that flexibility!

  • @BrunodeSouzaLino
    @BrunodeSouzaLino Před 17 dny

    It's hard to believe that a single "my humble homelab" video might be the whole reason this video is being made...

  • @jdrakehoffman
    @jdrakehoffman Před 18 dny

    I think you might like 3d printing!

  • @Spinnen
    @Spinnen Před 18 dny

    So where did we end up on power usage?