454 create readme for lme 2.0 #455

Closed · wants to merge 3 commits · changes from 1 commit
Upgrade 1x to 2.0 (#428)
* Adds scripts to import and export 1.x data

* Modifies the import script to use podman

* Adds the dashboard importer for 1.x to 2.0

* Updates the import and export scripts to add mappings

* Updates the field limit on winlogbeat index upon import

* Moves the upgrade scripts to a folder and requires directory on import

* Adds ability to remove the old docker volumes

* Puts the volume remover in the upgrade directory

* Makes the volume remover executable

* 2x readme

* Increase default maximum field limit

* Alter title of imported dashboards to indicate 1x import

* Clarify some points in the upgrade readme

* Read the passwords and username from the config file if it exists
cbaxley authored Sep 16, 2024

This commit was created on GitHub.com and signed with GitHub's verified signature.
commit 626aa7544514ce4981aaa74d859ed06af9cb72b5
3 changes: 2 additions & 1 deletion .gitignore
@@ -29,4 +29,5 @@ testing/tests/assets/style.css
*.vim
**.password.txt
**.ip.txt
**.swp
**/quadlet/output
88 changes: 88 additions & 0 deletions scripts/README.md
@@ -0,0 +1,88 @@
# Upgrading from 1x to 2x
1. Check out the latest version of the LME repository into your home directory:
```bash
cd ~
git clone https://github.com/cisagov/LME.git
```
1. Export indices:

Run this command to export the indices:
```bash
cd ~/LME/scripts/upgrade
sudo ./export_1x.sh
```

Note: *This may take some time without feedback. Make sure it finishes successfully.*

A successful completion looks like this:
```bash
Data and mappings export completed. Backup stored in: /lme_backup
Files created:
 - /lme_backup/winlogbeat_data.json.gz
 - /lme_backup/winlogbeat_mappings.json.gz
```
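Before moving on, it can be worth sanity-checking the two exported archives. The helper below is a hypothetical sketch (not part of the LME scripts); the file names match those printed by `export_1x.sh`:

```shell
# Hypothetical helper: confirm both exported archives exist and are intact gzip
verify_backup() {
    local dir="$1"
    local f
    for f in winlogbeat_data.json.gz winlogbeat_mappings.json.gz; do
        gzip -t "${dir}/${f}" 2>/dev/null || { echo "Corrupt or missing: ${dir}/${f}"; return 1; }
    done
    echo "Backup files in ${dir} look valid"
}

# e.g. verify_backup /lme_backup
```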
1. Either export the dashboards or use the existing ones
- If you don't have custom dashboards, you can use the path to the existing ones in the following steps
```bash
/opt/lme/Chapter 4 Files/dashboards/
```
- If you have custom dashboards, you will need to export them and use that path:
```bash
# Export all of the dashboards (it is the last option when prompted)
cd ~/LME/scripts/upgrade/
pip install -r requirements.txt
python3 export_dashboards.py -u elastic -p yourpassword
```
- Your path to use for the importer will be:
```bash
/yourhomedirectory/LME/scripts/upgrade/exported/
```
1. Uninstall old LME version
```bash
sudo su
cd "/opt/lme/Chapter 3 Files/"
./deploy.sh uninstall

# Go back to your user
exit

# If you are using Docker for more than LME (you want to keep Docker)
sudo docker volume rm lme_esdata
sudo docker volume rm lme_logstashdata

# If you are only using docker for lme
# Remove existing volumes
cd ~/LME/scripts/upgrade
sudo su # Become root in the right directory
./remove_volumes.sh
# Uninstall Docker
./uninstall_docker.sh

# Rename the directory to make room for the new install
mv /opt/lme /opt/lme-old
exit # Go back to regular user
```
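To confirm the old volumes are actually gone before installing 2.x, a small filter over `docker volume ls -q` can help. This is a sketch, assuming the `lme_` prefix used by the volume names above:

```shell
# Hypothetical check: count leftover lme_* volumes from a `docker volume ls -q` listing
lme_volumes_remaining() {
    grep -c '^lme_' || true   # grep -c prints 0 (and exits non-zero) when nothing matches
}

# usage: sudo docker volume ls -q | lme_volumes_remaining   # 0 means all removed
```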
1. Install LME version 2x
```bash
#***** Make sure you are running as a normal user *****#
sudo apt-get update && sudo apt-get -y install ansible

# Copy the environment file
cp ~/LME/config/example.env ~/LME/config/lme-environment.env

# Edit the lme-environment.env and change all the passwords
# vim ~/LME/config/lme-environment.env

# Change to the script directory
cd ~/LME/scripts/

ansible-playbook install_lme_local.yml

# Load podman into your environment
. ~/.bashrc

# Have the full paths of the winlogbeat files that you exported earlier ready
./upgrade/import_1x.sh

# Use the path from above dashboard update or original dashboards
sudo ./upgrade/import_dashboards.sh -d /opt/lme-old/Chapter\ 4\ Files/dashboards/
```
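Since the install step above depends on every password in `lme-environment.env` being changed, a quick pre-flight grep can catch forgotten ones. This is a hypothetical guard: `CHANGEME` is an assumed placeholder marker, so adjust it to whatever `example.env` actually uses:

```shell
# Hypothetical guard: flag lines still carrying an assumed "CHANGEME" placeholder
find_placeholders() {
    local env_file="$1"
    if grep -n 'CHANGEME' "$env_file"; then
        echo "Placeholder values remain in ${env_file} -- edit them first" >&2
        return 1
    fi
    echo "No placeholder values found in ${env_file}"
}

# e.g. find_placeholders ~/LME/config/lme-environment.env
```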
42 changes: 42 additions & 0 deletions scripts/check_password.sh
@@ -0,0 +1,42 @@
#!/bin/bash

check_password() {
local password="$1"
local min_length=12

# Check password length
if [ ${#password} -lt $min_length ]; then
echo "Input is too short. It should be at least $min_length characters long."
return 1
fi

# Generate SHA-1 hash of the password
hash=$(echo -n "$password" | openssl sha1 | awk '{print $2}')
prefix="${hash:0:5}"
suffix="${hash:5}"

# Check against the HIBP range API (k-anonymity: only the 5-char prefix is sent)
response=$(curl -s "https://api.pwnedpasswords.com/range/$prefix")

if echo "$response" | grep -qi "^${suffix}"; then
echo "This input has been found in known data breaches. Please choose a different one."
return 1
fi

# If we've made it here, the input meets the requirements
echo "Input meets the complexity requirements and hasn't been found in known data breaches."
return 0
}

# Main script
if [ -n "$CHECKME" ]; then
# Use input from environment variable
check_password "$CHECKME"
elif [ $# -eq 1 ]; then
# Use input from command-line argument
check_password "$1"
else
echo "Usage: CHECKME=your_input $0"
echo " or: $0 your_input"
exit 1
fi
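The k-anonymity trick in `check_password.sh` can be seen in isolation: only the first five hex characters of the SHA-1 hash ever leave the machine, and the API's candidate suffixes are matched locally against the remaining 35. A standalone sketch:

```shell
# Standalone sketch of the HIBP k-anonymity split used above
hash=$(printf '%s' 'example input' | openssl sha1 | awk '{print $NF}')
prefix="${hash:0:5}"    # sent to api.pwnedpasswords.com/range/<prefix>
suffix="${hash:5}"      # compared locally against the returned candidates
echo "query: $prefix  local: ${#suffix} hex chars"
```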
179 changes: 179 additions & 0 deletions scripts/upgrade/export_1x.sh
@@ -0,0 +1,179 @@
#!/bin/bash

set -e

LME_PATH="/opt/lme"
ES_PORT="9200"
ES_PROTOCOL="https"

# Function to get the host IP address
get_host_ip() {
ip route get 1 | awk '{print $7;exit}'
}

ES_HOST=$(get_host_ip)

# Function to find the mounted filesystem with the most free space
find_max_space_drive() {
    # Use df -Pk (1K blocks) so the numeric comparison is unit-safe; stripping
    # letters from human-readable sizes would compare e.g. 900M as larger than 50G
    df -Pk | awk '
        BEGIN { max=0; maxdir="/" }
        NR>1 && $1 !~ /^tmpfs/ && $1 !~ /^efivarfs/ && $1 !~ /^\/dev\/loop/ {
            if ($4+0 > max+0) {
                max = $4
                maxdir = $6
            }
        }
        END { print maxdir }
    '
}

# Function to clean up path (remove double slashes)
clean_path() {
echo "$1" | sed 's#//*#/#g'
}

# Function to check Elasticsearch connection and version
check_es_connection() {
local response
local http_code
response=$(curl -s -k -u "${ES_USER}:${ES_PASS}" -w "\n%{http_code}" "${ES_PROTOCOL}://${ES_HOST}:${ES_PORT}")
http_code=$(echo "$response" | tail -n1)
body=$(echo "$response" | sed '$d')

if [ "$http_code" = "200" ]; then
es_version=$(echo "$body" | jq -r '.version.number')
if [[ "${es_version}" =~ ^8\. ]]; then
echo "Successfully connected to Elasticsearch version ${es_version}"
return 0
else
echo "Unsupported Elasticsearch version: ${es_version}. This script supports Elasticsearch 8.x."
return 1
fi
elif [ "$http_code" = "401" ]; then
echo "Authentication failed. Please check your username and password."
return 1
else
echo "Failed to connect to Elasticsearch. HTTP status code: ${http_code}"
return 1
fi
}

# Function to export data and mappings using Docker and elasticdump
export_data_and_mappings() {
local output_dir="$1"

echo "Exporting winlogbeat-* indices data..."
docker run --rm -v "${output_dir}:${output_dir}" \
--network host \
-e NODE_TLS_REJECT_UNAUTHORIZED=0 \
elasticdump/elasticsearch-dump \
--input=${ES_PROTOCOL}://${ES_USER}:${ES_PASS}@${ES_HOST}:${ES_PORT}/winlogbeat-* \
--output=$ \
--type=data \
--headers='{"Content-Type": "application/json"}' \
--sslVerification=false | gzip > "${output_dir}/winlogbeat_data.json.gz"

echo "Exporting winlogbeat-* indices mappings..."
docker run --rm -v "${output_dir}:${output_dir}" \
--network host \
-e NODE_TLS_REJECT_UNAUTHORIZED=0 \
elasticdump/elasticsearch-dump \
--input=${ES_PROTOCOL}://${ES_USER}:${ES_PASS}@${ES_HOST}:${ES_PORT}/winlogbeat-* \
--output=$ \
--type=mapping \
--headers='{"Content-Type": "application/json"}' \
--sslVerification=false | gzip > "${output_dir}/winlogbeat_mappings.json.gz"
}

# Function to prompt for password securely
prompt_password() {
local prompt="$1"
local password
while IFS= read -p "$prompt" -r -s -n 1 char
do
if [[ $char == $'\0' ]]; then
break
fi
prompt='*'
password+="$char"
done
echo "$password"
}

# Main script
echo "LME Data Export Script for Elasticsearch 8.x"
echo "============================================"

echo "Using host IP: ${ES_HOST}"

# Check if Docker is installed and running
if ! command -v docker &> /dev/null; then
echo "Error: Docker is not installed. Please install Docker to proceed."
exit 1
fi

if ! docker info &> /dev/null; then
echo "Error: Docker daemon is not running. Please start Docker to proceed."
exit 1
fi

# Prompt for Elasticsearch credentials and verify connection
while true; do
read -p "Enter Elasticsearch username: " ES_USER
ES_PASS=$(prompt_password "Enter Elasticsearch password: ")
echo # Move to a new line after password input

if check_es_connection; then
break
else
echo "Would you like to try again? (y/n)"
read -r retry
if [[ ! $retry =~ ^[Yy]$ ]]; then
echo "Exiting script."
exit 1
fi
fi
done

# Determine backup location
echo "Choose backup directory:"
echo "1. Specify a directory"
echo "2. Automatically find directory with most space"
read -p "Enter your choice (1 or 2): " dir_choice

case $dir_choice in
1)
read -p "Enter the backup directory path: " BACKUP_DIR
;;
2)
max_space_dir=$(find_max_space_drive)
BACKUP_DIR=$(clean_path "${max_space_dir}/lme_backup")
echo "Directory with most free space: $BACKUP_DIR"
read -p "Is this okay? (y/n): " confirm
if [[ $confirm != [Yy]* ]]; then
echo "Please run the script again and choose option 1 to specify a directory."
exit 1
fi
;;
*)
echo "Invalid choice. Exiting."
exit 1
;;
esac

# Clean up the final BACKUP_DIR path
BACKUP_DIR=$(clean_path "$BACKUP_DIR")

# Create backup directory if it doesn't exist
mkdir -p "${BACKUP_DIR}"

# Export data and mappings
export_data_and_mappings "${BACKUP_DIR}"

echo "Data and mappings export completed. Backup stored in: ${BACKUP_DIR}"
echo "Files created:"
echo " - ${BACKUP_DIR}/winlogbeat_data.json.gz"
echo " - ${BACKUP_DIR}/winlogbeat_mappings.json.gz"