Thursday 23 April 2020

Get Azure Key Vault Data into Splunk

I want to learn more about Azure, and I love Splunk, so I wanted to see how I could get diagnostic data from Azure Key Vault into Splunk.

There's seemingly a million different ways to get log data out of Azure, with different levels of logging showing you different things.

This article: https://docs.microsoft.com/en-us/azure/security/fundamentals/log-audit does a good job of explaining the various logs found in Azure.

For today however, I am going to focus on Azure Key Vault Diagnostic Logs, streamed to an Event Hub and then picked up with Splunk using this Splunk App: https://splunkbase.splunk.com/app/4343/#/overview

In my particular scenario I have the following Azure resources:

- A blob type storage account
- An Event Hub namespace and Event Hub
- An Azure Key Vault configured to send diagnostic logs to the Event Hub
- A Splunk Instance

In my case I needed to create a storage account and blob container first:
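
If you prefer the command line over the portal, a rough Az PowerShell equivalent looks something like this (the resource group, account and container names are all placeholders):

# create the storage account that will hold the Event Hub capture data
New-AzStorageAccount -ResourceGroupName 'kv-logging-rg' -Name 'kvlogstorage123' -Location 'eastus' -SkuName 'Standard_LRS' -Kind 'StorageV2'

# grab a key for the new account and use it to create the blob container
$key = (Get-AzStorageAccountKey -ResourceGroupName 'kv-logging-rg' -Name 'kvlogstorage123')[0].Value
$ctx = New-AzStorageContext -StorageAccountName 'kvlogstorage123' -StorageAccountKey $key
New-AzStorageContainer -Name 'keyvault-logs' -Context $ctx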



Next up we need to create the Event Hub namespace and Event Hub, which will point to the blob storage we created above:


And then the Event Hub itself:
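
I did both of these through the portal, but for reference a rough Az PowerShell sketch is below - cmdlet parameter names have shifted a bit between Az.EventHub versions and the names here are placeholders, so double-check with Get-Help before running anything:

# the namespace needs to be Standard tier for the Capture feature used below
New-AzEventHubNamespace -ResourceGroupName 'kv-logging-rg' -Name 'kv-logs-ns' -Location 'eastus' -SkuName 'Standard'
New-AzEventHub -ResourceGroupName 'kv-logging-rg' -NamespaceName 'kv-logs-ns' -Name 'kv-logs-hub'
# I enabled Capture (pointing at the blob container created earlier) in the portal blade shown below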


Here we are turning capture on, and then selecting the container:



Now we're ready to point our Key Vault to this newly created Event Hub:



We want to enable both AuditEvents and AllMetrics, and select Stream to an Event Hub, and then pick our Subscription, Event Hub namespace, hub name and policy name:
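
Again I did this in the portal, but a rough Az PowerShell sketch is below - Set-AzDiagnosticSetting's parameter set has changed across Az.Monitor versions and the names are placeholders, so treat this as a starting point rather than copy/paste:

$kv   = Get-AzKeyVault -VaultName 'my-keyvault'
$rule = Get-AzEventHubAuthorizationRule -ResourceGroupName 'kv-logging-rg' -NamespaceName 'kv-logs-ns' -Name 'RootManageSharedAccessKey'
# with no category filters this enables everything, which for Key Vault is the audit log category plus AllMetrics
Set-AzDiagnosticSetting -ResourceId $kv.ResourceId -EventHubAuthorizationRuleId $rule.Id -Enabled $true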



Now that we have our Key Vault sending to our Event Hub, the next step is to configure Splunk to pick it up using the app mentioned at the start of the post:


The values here are pretty straightforward:

- The index value tells Splunk where to put the data
- The Azure Storage Account Name is the name that you gave your storage account when it was created
- The Storage Account Access Key can be found here (or pulled with PowerShell, as shown after this list):


- The Container Name is the name that you gave to your container within the capture settings of your Event Hub
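
The access key can also be pulled with Az PowerShell if you'd rather not dig through the portal blade (resource names are placeholders):

(Get-AzStorageAccountKey -ResourceGroupName 'kv-logging-rg' -Name 'kvlogstorage123')[0].Value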

Now let's retrieve a value from our Key Vault to generate some logs:
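
A quick way to do that from PowerShell (vault and secret names are placeholders):

Get-AzKeyVaultSecret -VaultName 'my-keyvault' -Name 'test-secret'

A single call like this should show up as a SecretGet operation in the audit log.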


The Key Vault data includes metrics data on API hits and latency times:


As well as operations performed on your Key Vault:



The data also includes the IP addresses of the clients making the requests, as well as the application used and other information:


Now you can get some extra visibility into what's going on with your Azure Key Vault. Nice.

The "correlationId" field for these events can be joined together with other Azure Events for a more comprehensive view and more advanced queries (perhaps another blog post :) )

Some notes:

- I'm still learning Azure, so I have no idea if this is the "best" way to get Azure Key Vault logs into Splunk. I know Azure Monitor is a thing, but I wanted to see if I could get logs in straight from the Key Vault
- Check out PowerZure which will help you generate some interesting data to look at: https://posts.specterops.io/attacking-azure-azure-ad-and-introducing-powerzure-ca70b330511a
- I created all my resources in the same Azure region
- Check the networking settings of your Storage Account if stuff doesn't work

Friday 20 March 2020

Edit Your Sysmon Config in Style

Does this look familiar?


Until very recently, I was there too.

Notepad++ is a fantastic tool that I use for hours every day, but it's not ideal for editing large Sysmon config files. I recently looked at other options and found a setup that I was happy with - let's take a look at how you can replicate it.

To begin, grab a copy of Visual Studio Code ( https://code.visualstudio.com/download )

Navigate to the Extensions menu and search for and install the Sysmon extension:


We'll also be installing the "Bookmarks" extension:



Next up, install and set up Git for Windows ( https://git-scm.com/download/win )

During the setup, you'll have the option to use VS Code as the default editor for Git; I chose this option:


Now restart VS Code and start a new Sysmon config; the Sysmon extension will help you here:
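
If you haven't worked with one before, a Sysmon config is just an XML file along these lines (the exclusion shown is a made-up example rather than a tuning recommendation):

<!-- schemaversion must match what your Sysmon binary supports; 'sysmon -s' prints it -->
<Sysmon schemaversion="4.22">
  <EventFiltering>
    <ProcessCreate onmatch="exclude">
      <Image condition="end with">\GoogleUpdate.exe</Image>
    </ProcessCreate>
  </EventFiltering>
</Sysmon>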


In my case, I had a private GitHub repo set up for my Sysmon config, so I want to set that repo up and then save my new blank Sysmon config file to it:
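
From the terminal, that amounts to cloning the repo (or initializing a fresh one) into a local folder - the URL below is a placeholder:

cd C:\Repos
git clone https://github.com/<your-account>/sysmon-config.git
# or, if you're starting from scratch:
git init sysmon-config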


Now save your config file and give it a name. Make sure to save it to the folder where you initialized your Git repo - in my case, C:\Repos\sysmon-config

You should see a "1" icon next to the source control menu option in VS Code after saving your config file. Navigate to that menu, add your commit message, stage the changes and then commit them to your GitHub repo:


When you are done committing your changes, push them via the terminal or the GUI; you should then see your changes in GitHub:
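
For reference, the terminal version of that stage/commit/push cycle is just:

git add sysmonconfig.xml        # or whatever you named your config file
git commit -m "Add initial Sysmon config"
git push                        # or 'git push -u origin master' the first time, if you started with git init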

Now your Sysmon config is source controlled and resides in a feature-rich text editor with auto-complete tailored to Sysmon's config syntax. Nice!

As a final step, use the Bookmarks add-on to configure bookmarks within your Sysmon config so you can follow along with the various sections in a large config file.

If you've installed the Bookmarks extension, navigate to the line where you want to set your bookmark and hit CTRL+ALT+K (or right click --> Toggle Bookmark). In my case I'm going to bookmark my ProcessCreate exclusions section:


You can also rename this bookmark to your liking; I've found this kind of bookmark layout to be a real time-saver:



And that's it -- a nice upgrade from Notepad++

Notes: 

- Thank you https://twitter.com/Carlos_Perez for the awesome VSCode Sysmon extension
- For great Sysmon configs, check out:
https://github.com/SwiftOnSecurity/sysmon-config
https://github.com/olafhartong/sysmon-modular



Sunday 15 March 2020

Wrangle Your PowerShell Transcript Logs with Apache Nifi

Intro:

You are an enterprise defender who has done the right thing and enabled PowerShell transcript logging. Now you have a whole bunch of flat text files containing PowerShell transcript logs, filled with various application noise. What do you do now?

Enter Apache Nifi:

Apache Nifi is a data routing and transformation system, available to download here: https://nifi.apache.org/download.html. This post will show you how to use Apache Nifi to extract relevant text from PowerShell transcript logs and send it to whatever logging system you have; I will be using Splunk as an example.

Installation:

First let's get Apache Nifi installed. You'll need to grab the install file for Nifi as well as OpenJDK

Nifi Download: https://nifi.apache.org/download.html

OpenJDK: https://jdk.java.net/

In my example I will be running this on a Windows system, but Nifi supports various operating systems and has a Docker image as well.

Extract the contents of OpenJDK into C:\Java, then edit your environment variables to include the bin directory under C:\Java\:
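
If you'd rather script the PATH change than click through the System Properties dialogs, something like this from PowerShell works (adjust the path if your extract created a jdk-* subfolder under C:\Java):

# append the OpenJDK bin directory to the user PATH
$userPath = [Environment]::GetEnvironmentVariable('Path', 'User')
[Environment]::SetEnvironmentVariable('Path', "$userPath;C:\Java\bin", 'User')

# open a NEW prompt afterwards and confirm java is found:
java -version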

If you've done this step correctly, you should be able to open a new Command Prompt and execute the java command.

Next, extract the contents of your Apache Nifi zip file into whatever directory you want, then navigate to the bin directory and execute 'run-nifi.bat'. You should see something similar to the following:


Making a Flow:

At this point Nifi should be running, and you should be able to browse to http://localhost:8080/nifi and see something like this:


The icons near the top left Nifi logo are the various processors, funnels and labels that Apache Nifi uses to do its data wizardry - you drag the elements onto the 'canvas'. For our purposes we will be making a basic flow using processors only. When a processor is dragged to the canvas, Apache Nifi gives you options as to which processor you want to use. We will be starting with the "GetFile" processor:


Let's take a look at the properties of this processor:

We are telling this processor to look in the PSLogs directory, recursively. The Keep Source File option is set to false here, so Apache Nifi will delete the files when it is done processing them. Flip this switch to true if you are just testing your flow, but keep in mind Nifi will continuously process your flow until you stop it.
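
In other words, the key GetFile properties are along these lines (the directory path is just wherever your transcript logs land):

Input Directory: C:\PSLogs
Recurse Subdirectories: true
Keep Source File: false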

Next we want to use the ExtractText processor to perform regular expression matches against the contents of the logs that our GetFile processor read. Hover over the GetFile processor until the arrow icon appears, then drag it to the ExtractText processor:


When you successfully link the processors, a menu will appear asking which relationship you want to link. For the GetFile processor the only relationship type is success, so we select that option:


Next up we want to configure the ExtractText processor to run some regular expressions against the text that GetFile just picked up:


The + symbol allows us to enter our own properties. I have added three sets of regular expression matches for basic, potentially undesirable PowerShell commands and have enabled DOTALL and Multiline modes.
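
Each custom property is just a name plus a regular expression - something along the lines of these made-up examples (illustrative only, and deliberately loose; not my exact patterns):

ps.downloadcradle = (?i)(Net\.WebClient|DownloadString|DownloadFile|Invoke-WebRequest)
ps.encodedcommand = (?i)-e(nc|ncodedcommand)?\s+[A-Za-z0-9+/=]{20,}
ps.invokeexpression = (?i)(Invoke-Expression|IEX\s*\()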

In the settings tab of the ExtractText processor, I have set it to automatically terminate unmatched flows, and to route matched flows to our next processor, AttributesToJSON:




At this point, we now have Apache Nifi getting the contents of a file, performing regex matches against that file, then converting those matches into JSON format. Now that we have our matching data, we need to send it somewhere. For developing your flow, you may want to use the PutFile processor to just write your flow contents to a file for testing. In my case I have ended this flow with a PutSplunk Processor:


Note that if you're using Splunk, you'll need to create an index and a listener first; in my case I have a listener set up on port 1234 with a sourcetype of JSON without a timestamp.
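
If you prefer config files to the Splunk GUI, that listener boils down to an inputs.conf stanza roughly like this (the index name and sourcetype are whatever you set up; _json is just Splunk's built-in JSON sourcetype):

[tcp://:1234]
index = nifi
sourcetype = _json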

Since the PutSplunk processor is the last in my flow, I will tell it to terminate both success and failure relationships:


The flow should resemble something like this when you're all done:


Now locate the play button in the "Operate" window in the Nifi interface and watch your flow do its thing:


Now let's take a look at the data in Splunk:


The field names highlighted correspond to the custom attributes you created in the ExtractText processor (I should have used much better names) and the field values contain the regular expression matches.

The file metadata is also included.

Here's what the data looks like with a little bit of formatting:


Notes: 

- My original use-case for looking at Nifi was to take a bunch of PowerShell transcript logs, do regex on them and then send only matches to Splunk in order to save on quota
- I am not an expert in Nifi at all, so there's probably more efficient ways to do what I'm trying to accomplish
- Huge thank you to https://twitter.com/Wietze for helping me work through some Nifi issues
- I have not yet tested this in a full-blown production environment, but from limited testing Nifi seems to chew through large transcript logs without issue

More on PowerShell logging:

https://www.fireeye.com/blog/threat-research/2016/02/greater_visibilityt.html
https://devblogs.microsoft.com/powershell/powershell-the-blue-team/



Wednesday 26 June 2019

(Very) Basic Elastic SIEM Set up

Recently Elastic announced the release of a SIEM product. In this post I'm going to do a very basic set up and brief overview of the product.

Some caveats first

- I usually set up ELK in a lab environment, so this post doesn't cover any security settings for ELK
- I don't use ELK day-to-day so there's probably a bunch of stuff that can be done differently / more effectively

With that out of the way, let's dive in:

The first thing you need to do is install Elastic and Kibana. If you follow the official guide, the links take you to version 8.0 alpha; at the time of writing those links seem to be broken, and my setup is with version 7.2.

For my setup, I'm installing Elastic and Kibana on an Ubuntu 18.04 box, and some beats packages on a few Windows hosts.

To get Elastic on Ubuntu: https://www.elastic.co/guide/en/elasticsearch/reference/current/deb.html

To get Kibana on Ubuntu: https://www.elastic.co/guide/en/kibana/current/deb.html

Once Elastic and Kibana are installed, a few tweaks to the Elastic config were needed:



Without the last line in the above screenshot, starting Elastic with the network host set to 0.0.0.0 errored out; in my case I am only using one Elastic node.
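
For a single-node lab install, the combination that matters in elasticsearch.yml is typically:

network.host: 0.0.0.0
discovery.type: single-node

The single-node discovery setting is what stops the bootstrap checks from complaining once you bind to a non-loopback address.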

I had to make a similar tweak in the Kibana configuration file:
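
The Kibana side usually amounts to something like this (the IP being that of the box running Elastic):

server.host: "0.0.0.0"
elasticsearch.hosts: ["http://<elastic-ip>:9200"]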


If all works well, you should see similar curl output (note that I am not using localhost or 127.0.0.1 as the IP; if the curl command works with the actual IP of the box you're hosting Elastic on, then the beats should have no issues connecting):
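
The check itself is just the following, and it should return a short JSON document with the node and version details:

curl http://<elastic-ip>:9200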


At this point you should have Elastic and Kibana up and running, now let's get the beats installed.

The following page has links to all the beats shippers you would need: https://www.elastic.co/guide/en/siem/guide/current/install-siem.html

In my case I'm only using Winlogbeat + Packetbeat

Here is what the relevant sections of my winlogbeat.yml look like:
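
To give a rough idea, the pieces of winlogbeat.yml that matter for this kind of setup look like this (the IP is a placeholder, and the event log list is just an example set of channels):

winlogbeat.event_logs:
  - name: Application
  - name: System
  - name: Security
  - name: Microsoft-Windows-Sysmon/Operational

output.elasticsearch:
  hosts: ["<elastic-ip>:9200"]

setup.kibana:
  host: "<elastic-ip>:5601"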


If you are installing packetbeat on a Windows host, make sure to grab WinPcap first, then follow the relevant setup instructions. One thing I noticed with packetbeat is that if you are executing packetbeat.exe in a PowerShell window as per the instructions, it sometimes doesn't display errors; if you run it from a standard command prompt, it will spit errors out on the console window.

Also, make sure you are selecting the right network device for packetbeat to sniff from:
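
You can list the available interfaces (and the index numbers packetbeat assigns to them) from the packetbeat folder, then reference the right index in packetbeat.yml:

.\packetbeat.exe devices

# then in packetbeat.yml:
packetbeat.interfaces.device: 0    # use the index of the interface you want to sniff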



Now for the good stuff. Assuming you have Elastic, Kibana and your beats set up correctly, you should be able to browse to <IP>:5601 and see a home page similar to the following (the SIEM option is highlighted in yellow):



Here is what my "Hosts" page looks like:


Now let's look for some simple events; you can use the search bar at the top to input your query.
On one of my test machines I just ran:

Invoke-Webrequest www.google.com 

In the below screenshot, I'm trying to find that event in the Sysmon logs:

If you hover over the three dots in the Events table, you can see the full event details:


Now let's use this event to dig a little further using the timeline on the right-hand side.

When you click the timeline text, a menu pops up and you can drag and drop data to the timeline:



This gives us a timeline view, using our source IP as a filter:


From here you can pin events, filter further down and view the raw JSON. Let's filter our events down a little further; you can build reasonably complex queries pretty quickly using drag and drop. The GUI here is really intuitive.


Let's go back to our PowerShell example via the timeline; this time I'm putting the source and destination IPs from our previous query into it:


If we look from the bottom up, we can see that a Sysmon NetworkConnect event was detected, followed by some TCP traffic. We can pivot off the destination IP for more info; clicking the IP hyperlink within the timeline brings us to this window:

Here you can tweak the dashboard to show the IP as a source or destination.




In the above, we can see the destination IP was google.ca, so we know that PowerShell connected to google.ca.

Elastic SIEM also supports JA3 hashes, so if we look at an IP that established a TLS connection we see the following:


Again we can drag the JA3 element onto the timeline to see what other systems may have established the same TLS handshake:


Hopefully this post helped demonstrate a really basic set up with the new Elastic SIEM. I'm sure I've hardly scratched the surface for what's possible.