
Lessons I Learned (so far) Building a Cyber Range

 Long time no see! As we all gear up for the Christmas holidays, I figured I would circle back to this dusty old blog and tell you what I've been up to, as well as why I have been radio silent over the last little bit. For those who do not know, I have been homelabbing since the summer and have been pretty much flat-out making things work in this lab. One of those things is a dedicated cyber range where defensive and offensive operations can be launched, all in a safe environment that runs on the infrastructure I have built.

If you're looking to build your own cyber range, this is the blog post for you, as I will be covering all the lovely hiccups and lessons from my ongoing journey. I will also be providing an opportunity for YOU to get your hands on the range if you wish to play around. Let's talk homelab!

The Tech Stack

Infrastructure

So for my tech stack, I learned fairly quickly that having a good hypervisor is crucial for setting up your cyber range. I tried both Hyper-V Server 2019 and VirtualBox, but ended up running Proxmox on top of my Hyper-V environment. The goal was to add an additional layer of virtualization so that the cyber range's infrastructure, such as the DC and clients, runs in its own virtualized environment rather than alongside the other VMs on my server. That way, I can further isolate it should the need arise. This was easier said than done, as Windows Server 2025 and 2022 both require everything to be in check prior to the first boot, otherwise the install is bricked. It wasn't until I turned off KVM hardware virtualization and switched to an evaluation copy of Windows Server 2022 that things finally worked.
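For reference, disabling KVM hardware virtualization on a Proxmox VM is a one-line change in that VM's config file. The snippet below is a minimal sketch of roughly what my Windows Server 2022 guest's config looks like; the VMID (100), disk sizing, storage names, and ISO name are placeholders, not my actual values:

```
# /etc/pve/qemu-server/100.conf -- VMID 100 and storage names are placeholders
name: dc01
ostype: win11          # Proxmox's guest profile covering Windows 11 / Server 2022
kvm: 0                 # disable KVM hardware virtualization for the nested setup
cores: 4
memory: 8192
scsi0: local-lvm:vm-100-disk-0,size=64G
ide2: local:iso/WS2022_EVAL.iso,media=cdrom
net0: virtio=DE:AD:BE:EF:00:01,bridge=vmbr0
```

The same change can also be made from the Proxmox CLI with `qm set 100 --kvm 0`, or in the web UI under the VM's Options tab.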

Once I had my DC up, it was time to turn my attention to creating some users. Thanks to a tool that I built (announcement TBD), I was able to create a full-fledged corporate directory complete with roles, managers, offices, and phone numbers. The goal here was to make it as realistic as possible compared to what you would see in a corporate environment. Once this was done, I started creating some mock endpoints for simulations, and with that out of the way, I could begin simulating user interactions! I used GHOSTS integrated with Swagger and Grafana to simulate realistic user traffic that can be fed to the SIEM. These are the logs I will be forwarding to the SIEM:
  • Windows Event Logs (auditing successful processes and the like) using Winlogbeat
  • Sysmon logs from Linux machines and other machines I decide to pull from
  • Network traffic logs from the endpoints (pending firewall appliance)
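For the Windows Event Log piece, the Winlogbeat side boils down to a short YAML file. Below is a minimal sketch; the channel list, Elasticsearch host, and credentials are assumptions you would swap for your own environment:

```yaml
# winlogbeat.yml -- minimal example; host and credentials are placeholders
winlogbeat.event_logs:
  - name: Security
  - name: System
  - name: Microsoft-Windows-Sysmon/Operational

output.elasticsearch:
  hosts: ["https://elastic.lab.local:9200"]
  username: "winlogbeat_writer"
  password: "${WINLOGBEAT_PW}"
```

Run `winlogbeat test config` and `winlogbeat test output` before starting the service to catch typos and connectivity issues early.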
If you haven't heard of GHOSTS, then you're missing out. GHOSTS allows you to simulate various usage patterns such as sign-ins, file access, and other user events, making it a prime choice for simulating user interactions that can then be fed to any SIEM you wish. I recommend checking it out sometime if you haven't already: GitHub - cmu-sei/GHOSTS: GHOSTS is a realistic user simulation framework for cyber experimentation, simulation, training, and exercise. Albeit a bit of a pain to set up at first, once it is all good to go it is pretty smooth sailing.

Now you may have noticed that the firewall is pending. This is because I am waiting on a small mini-PC that I will install OPNsense onto to collect telemetry from any assets within my range VLAN, so this is a work in progress. I also just got a managed switch so that I can VLAN-tag my cyber range traffic separately from my server traffic, which will further isolate the range from my homelab and prevent any IP scans or lateral movement.
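If you go the same route, Proxmox can put range traffic on its own VLAN using a VLAN-aware bridge. A minimal sketch of the relevant part of /etc/network/interfaces, where the NIC name eno1 and VLAN 20 are assumptions for illustration:

```
# /etc/network/interfaces -- eno1 and VLAN 20 are assumptions
auto vmbr0
iface vmbr0 inet manual
    bridge-ports eno1
    bridge-stp off
    bridge-fd 0
    bridge-vlan-aware yes
    bridge-vids 2-4094
```

Each range VM then gets a VLAN tag on its NIC (e.g. `qm set <vmid> --net0 virtio,bridge=vmbr0,tag=20`), and the managed switch port facing the firewall carries that VLAN.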

SIEM

In case you did not know, SIEM stands for Security Information and Event Management, and it is the tool security professionals use to investigate security events. In this case, I wanted to go with Elastic and Kibana as these are both tools I use within my homelab environment. You could also use other open-source tools like Wazuh or Security Onion to achieve the same thing, but one thing I love about the ELK stack is the ability to bend and meld dashboards to what you require. I also like how I can use KQL (Kibana Query Language, not Kusto) to really get down and dirty with the data. For those who are not aware, Elastic can also be used for purposes other than security, which is the great thing about it! It is a data-science-oriented tool with Kibana integration so that you can build exactly the reporting you desire.
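To give a flavour of KQL, here are a couple of queries I find myself reaching for when digging through Windows Event Logs in Kibana. The field names assume the Winlogbeat/ECS schema described above:

```
# Failed network logons (event ID 4625, logon type 3)
event.code: "4625" and winlog.event_data.LogonType: "3"

# Sysmon process-creation events launching PowerShell
event.code: "1" and process.name: "powershell.exe"
```

Paste either into the Discover search bar with the Winlogbeat index pattern selected and narrow the time range from there.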

I plan to eventually integrate other open-source SIEMs into the mix, such as Wazuh and Security Onion, but as of right now I just want to get the range up and running. In the end, the range will contain a few different SIEMs to appeal to the appetites of whoever is using it. If you know of any, or if you're a vendor willing to help, please let me know!

Ticketing and Knowledge Base

For the ticketing system, I created a Power Pages portal that teams can use to submit and track tickets as well as view a knowledge base. I plan to share this in a subsequent article, but the reason I went this direction was because I wanted to try Power Pages and I didn't want to bog down my range with a bunch of different VMs running different things. If you do not have Power Pages, you can also use an open-source ticketing system such as osTicket, which can be installed on a virtual machine. Either way, it is important to have some form of ticketing and knowledge base system that analysts can use to record incidents and consult written playbooks (all made by those who will use the range).

Lessons Learned Thus Far

Of course, I have learned quite a bit along the way, and I hope these little tidbits of advice will help you build your own range (or avoid the complications that I ran into):

  1. Use Proxmox as the guest hypervisor for your range. It doesn't matter if the host is using Hyper-V or another hypervisor, but the guest that your range will use should be Proxmox. I wasted quite a bit of time playing around with Hyper-V Server 2019, only for it to have nothing but a CLI and a bit of a learning curve. Proxmox has a web interface that offers many more features. So if you're looking for the path of least resistance, choose Proxmox VE as your guest.
  2. Be prepared for compatibility issues when running Windows inside Proxmox. Since Proxmox is a Linux-based virtual environment, you will need to load VirtIO drivers onto the VMs you create if you're using Windows Server or other Microsoft operating systems. VirtIO offers these drivers for free, and they aren't super hard to set up once you bang your head off your desk a few times. Just turn off KVM virtualization and you'll be fine.
  3. Find a really good tutorial for Elastic and Kibana installation. There are a few out there, but some of them are outdated so you may end up spinning your wheels on some of them. Even better, just ask Copilot. 
  4. Sometimes the best fix to a messed up Elastic or GHOSTS install is to just start clean and with some food in your stomach.
  5. Persistence is key: there is nothing more satisfying than seeing Elastic light up with telemetry after pulling an almost all-nighter to get it working.
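On lesson 2: in practice this means downloading the virtio-win driver ISO, attaching it to the VM as a second CD-ROM, and pointing Windows setup at it when no disk is detected. The config line below is a sketch; the VMID (100), drive slot, and ISO filename are placeholders:

```
# Added to /etc/pve/qemu-server/100.conf -- attaches the VirtIO driver ISO
ide3: local:iso/virtio-win.iso,media=cdrom
```

Equivalently, `qm set 100 --ide3 local:iso/virtio-win.iso,media=cdrom` from the CLI. During Windows setup, click "Load driver" and browse to the storage (vioscsi) folder on that ISO; the network (NetKVM) driver can be installed the same way after first boot.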

So that pretty much covers everything I had to say in this one. If some of this went over your head, that's okay, it did for me too in the beginning. The only way to really understand it is to do it yourself! This Christmas will be spent fixing up the odds and ends, then rolling the range out to the first contingent of people. Until then, have a Merry Christmas and a Happy New Year!





