Rapture

No-nonsense data-breach search interface
Read the Guide


Information

About

Rapture is a simple, cross-platform, "no-nonsense" data-breach search interface designed to let individuals easily index and query compromised data in order to identify exposure and protect personal information and assets.

Attribution

This software is heavily inspired by MiyakoYakota's search.0t.rocks (GitHub | Archive), uses the same Solr configuration, and remains consistent with their core tenets of simplicity and usability. Thank you for your grand contribution to the development of open-source intelligence software and for providing a means for the community to expand on your developments.

Disclaimer

It is the end user's responsibility to obey all applicable local, state, and federal laws. Developers assume no liability and are not responsible for any misuse or damage caused by this program. This software comes AS IS with NO WARRANTY and NO GUARANTEES of any kind. By using Rapture, you agree to the previous statements.

AI Disclosure

The Rapture social card/preview uses content generated by OpenAI's DALL-E 2 AI system.

Prerequisites

  • Git
  • Docker Engine w/ Docker Compose
  • Java 8 or later
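
If you want to confirm the prerequisites before continuing, each of the standard version checks below should print a version string. Note that the Docker Compose plugin syntax is assumed here; older installations may ship a standalone docker-compose binary instead.

# Confirm that each prerequisite is installed and on the PATH
git --version
docker compose version
java -version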

Getting Started

A more comprehensive guide to configuring Rapture will be created in the near future. For now, the following instructions should suffice.

  1. Fetch the repository: git clone https://github.com/ax-i-om/rapture.git
  2. Navigate into Rapture's root directory: cd rapture
  3. Run: docker compose build
  4. Run: docker compose up -d

The initialization process may take ~1min, but that's it! You can now navigate to the Rapture web page at http://localhost:6175.
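
If you would like to verify that the stack is healthy before opening the browser, a quick check such as the one below should suffice (service names and output will vary with your compose file):

# List the services started by Docker Compose and their state
docker compose ps

# Confirm the web interface is responding on port 6175
curl -I http://localhost:6175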

Data Conversion

Below is an example of a valid/cleaned CSV file titled header.csv:

id,emails,passwords,firstName,lastName
ficticiousbreach-03122024-1,rapture@demo.com,VK9RSuK0wSSXNc0gF8iYW1f6,axiom,estimate
ficticiousbreach-03122024-2,rapture@demo.com,N7cblKU9ypU727lwiTr9espw,garble,vortex
ficticiousbreach-03122024-3,rapture@demo.com,eP0u9cM0jAa2QeUVI3d88rYn,vertiable,slap
ficticiousbreach-03122024-4,rapture@demo.com,QxtyRMAx3KniskzjGDg6tHdl,axiom,terrific
ficticiousbreach-03122024-5,rapture@demo.com,cirSMQZp7Enh98KLb6r8JT1I,garble,eighty
ficticiousbreach-03122024-6,rapture@demo.com,9J53HQEetTv5E2xCJKe4tdaP,veritable,tumble
ficticiousbreach-03122024-7,rapture@demo.com,3sZ7NPb54Fk0Qy2LXlLejwCu,axiom,slipper
ficticiousbreach-03122024-8,rapture@demo.com,OxoZGbn3v0tvBMyWA0Jds0Ea,garble,chord
ficticiousbreach-03122024-9,rapture@demo.com,et8ggkgUeyQ3ge7ua2YNsOLd,veritable,baffle

Only the fields specified during the execution of setup.sh are valid, and field names are case-sensitive; therefore, firstName is a valid key/field, whereas firstname and FIRSTNAME are invalid. Likewise, the passwords field is valid, but the singular form of the word (password) is not.

Note

Each data point is required to have an id field. You may format this field however you like, but each and every occurrence of an id must be unique. If a duplicate id is posted, the existing data will be overwritten.
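
The exact cleaning workflow depends on the source data, but as a minimal sketch, a raw email:password dump could be reshaped into the layout shown above with something like the following. The input file name (dump.txt) and the breach label are made up for illustration, and the naive colon split will misbehave if a password itself contains a colon:

# Hypothetical example: turn a colon-separated "email:password" dump into a
# cleaned CSV with a header row and a unique, line-numbered id per record
awk -F: 'BEGIN { print "id,emails,passwords" }
         { printf "examplebreach-03122024-%d,%s,%s\n", NR, $1, $2 }' dump.txt > cleaned.csv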

Data Importation

The process for importing data is straightforward. In the example below, we will use the Solr post tool to post the data from the header.csv file shown above.

Note

Supported file types: xml,json,csv,pdf,doc,docx,ppt,pptx,xls,xlsx,odt,odp,ods,ott,otp,ots,rtf,htm,html,txt,log

Note

The Solr post tool requires a Java runtime newer than version 8; on Debian/Ubuntu, one can be installed like so: sudo apt install openjdk-19-jre-headless

  1. Download Solr binary release from: https://solr.apache.org/downloads
  2. Extract binaries: tar zxvf solr-9.5.0.tgz
  3. Post data to Solr collection: /home/axiom/Desktop/solr-9.5.0/bin/post -c BigData -p 8983 -host 127.0.0.1 header.csv

Ensure that you adjust the command specified in step 3 to accommodate your environment:

/home/axiom/Desktop/solr-9.5.0/bin/post - Path to the Solr post binary (extracted in step 2)

-c BigData - Collection name (Leave this as is, unless you modified setup.sh)

-p 8983 - Solr service port (Leave this as is, unless you modified setup.sh)

-host 127.0.0.1 - Solr service host address (Leave this as is, unless you modified setup.sh)

header.csv - File containing the cleaned data you wish to post. This is the file demonstrated in Data Conversion.
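
Alternatively, if you would rather not install Java or the Solr binaries just to post a file, Solr's standard update endpoint accepts CSV over plain HTTP. This is a sketch assuming the default BigData collection, port 8983, and a host-reachable Solr instance, as in the command above:

# Post header.csv directly to the Solr update handler and commit immediately
curl 'http://127.0.0.1:8983/solr/BigData/update?commit=true' \
  -H 'Content-Type: application/csv' \
  --data-binary @header.csv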

After you have posted the data, you should be able to refresh the Rapture web page and successfully query the information!
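
If you prefer to confirm the import from the command line first, a direct query against the collection should return the posted documents (again assuming the default collection, host, and port):

# Optional sanity check: ask Solr for up to five records matching a known field value
curl 'http://127.0.0.1:8983/solr/BigData/select?q=firstName:axiom&rows=5'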