Mirror of https://github.com/ail-project/ail-framework.git, synced 2024-11-30 01:37:17 +00:00
chg: [core] mv bin/packages/config.cfg configs/core.cfg + use ConfigLoader
This commit is contained in:
parent
3c6e424ac3
commit
c8d5ce9a28
46 changed files with 306 additions and 1159 deletions
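The central refactor in this commit replaces the per-script `configparser` boilerplate (resolve the config path, check it exists, read it, build `redis.StrictRedis` connections by hand) with a shared `ConfigLoader` imported from `lib/`. As a rough illustration of the pattern only (a minimal sketch; the real `ConfigLoader` module in the AIL repository differs in detail), the wrapper resolves `configs/core.cfg` from `AIL_HOME` once and exposes typed getters:

```python
import os
import configparser

class MiniConfigLoader:
    """Illustrative stand-in for AIL's ConfigLoader: resolve the config
    file once, then hand out typed values and Redis connections."""

    def __init__(self, config_file=None):
        if config_file is None:
            # Same resolution the commit switches the modules to:
            # AIL_HOME/configs/core.cfg instead of AIL_BIN/packages/config.cfg.
            config_file = os.path.join(os.environ['AIL_HOME'], 'configs/core.cfg')
        if not os.path.exists(config_file):
            raise Exception('Unable to find the configuration file. '
                            'Did you set environment variables? '
                            'Or activate the virtualenv?')
        self.cfg = configparser.ConfigParser()
        self.cfg.read(config_file)

    def get_config_str(self, section, key):
        return self.cfg.get(section, key)

    def get_config_int(self, section, key):
        return self.cfg.getint(section, key)

    def get_redis_conn(self, section):
        # One call replaces the four-line redis.StrictRedis(...) blocks
        # this commit deletes from each module.
        import redis  # imported lazily so the rest of the sketch runs without redis-py
        return redis.StrictRedis(host=self.get_config_str(section, 'host'),
                                 port=self.get_config_int(section, 'port'),
                                 db=self.get_config_int(section, 'db'),
                                 decode_responses=True)
```

With a wrapper like this, a module's setup collapses to `config_loader = MiniConfigLoader()` followed by `get_redis_conn("Redis_Mixer_Cache")` and friends, which is exactly the shape of the replacements throughout this diff.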
4  .gitignore (vendored)
@@ -35,9 +35,9 @@ var/www/server.crt
 var/www/server.key
 
 # Local config
-bin/packages/config.cfg
-bin/packages/config.cfg.backup
 configs/keys
+configs/core.cfg
+configs/core.cfg.backup
 configs/update.cfg
 update/current_version
 files
9  HOWTO.md
@@ -25,7 +25,7 @@ Feed data to AIL:
 
 3. Launch pystemon ``` ./pystemon ```
 
-4. Edit your configuration file ```bin/packages/config.cfg``` and modify the pystemonpath path accordingly
+4. Edit your configuration file ```configs/core.cfg``` and modify the pystemonpath path accordingly
 
 5. Launch pystemon-feeder ``` ./bin/feeder/pystemon-feeder.py ```
 
@@ -123,7 +123,7 @@ There are two types of installation. You can install a *local* or a *remote* Spl
 (for a linux docker, the localhost IP is *172.17.0.1*; Should be adapted for other platform)
 - Restart the tor proxy: ``sudo service tor restart``
 
-3. *(AIL host)* Edit the ``/bin/packages/config.cfg`` file:
+3. *(AIL host)* Edit the ``/configs/core.cfg`` file:
 - In the crawler section, set ``activate_crawler`` to ``True``
 - Change the IP address of Splash servers if needed (remote only)
 - Set ``splash_onion_port`` according to your Splash servers port numbers that will be used.
@@ -134,7 +134,7 @@ There are two types of installation. You can install a *local* or a *remote* Spl
 
 - *(Splash host)* Launch all Splash servers with:
 ```sudo ./bin/torcrawler/launch_splash_crawler.sh -f <config absolute_path> -p <port_start> -n <number_of_splash>```
-With ``<port_start>`` and ``<number_of_splash>`` matching those specified at ``splash_onion_port`` in the configuration file of point 3 (``/bin/packages/config.cfg``)
+With ``<port_start>`` and ``<number_of_splash>`` matching those specified at ``splash_onion_port`` in the configuration file of point 3 (``/configs/core.cfg``)
 
 All Splash dockers are launched inside the ``Docker_Splash`` screen. You can use ``sudo screen -r Docker_Splash`` to connect to the screen session and check all Splash servers status.
 
@@ -148,7 +148,7 @@ All Splash dockers are launched inside the ``Docker_Splash`` screen. You can use
 - ```crawler_hidden_services_install.sh -y```
 - Add the following line in ``SOCKSPolicy accept 172.17.0.0/16`` in ``/etc/tor/torrc``
 - ```sudo service tor restart```
-- set activate_crawler to True in ``/bin/packages/config.cfg``
+- set activate_crawler to True in ``/configs/core.cfg``
 #### Start
 - ```sudo ./bin/torcrawler/launch_splash_crawler.sh -f $AIL_HOME/configs/docker/splash_onion/etc/splash/proxy-profiles/ -p 8050 -n 1```
 
@@ -166,4 +166,3 @@ Then starting the crawler service (if you follow the procedure above)
 ##### Python 3 Upgrade
 
 To upgrade from an existing AIL installation, you have to launch [python3_upgrade.sh](./python3_upgrade.sh), this script will delete and create a new virtual environment. The script **will upgrade the packages but won't keep your previous data** (neverthless the data is copied into a directory called `old`). If you install from scratch, you don't require to launch the [python3_upgrade.sh](./python3_upgrade.sh).
-
@@ -20,10 +20,10 @@ import datetime
 import json
 
 
-class PubSub(object):
+class PubSub(object): ## TODO: remove config, use ConfigLoader by default
 
     def __init__(self):
-        configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
+        configfile = os.path.join(os.environ['AIL_HOME'], 'configs/core.cfg')
         if not os.path.exists(configfile):
             raise Exception('Unable to find the configuration file. \
                             Did you set environment variables? \
@@ -111,7 +111,7 @@ class PubSub(object):
 class Process(object):
 
     def __init__(self, conf_section, module=True):
-        configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
+        configfile = os.path.join(os.environ['AIL_HOME'], 'configs/core.cfg')
         if not os.path.exists(configfile):
             raise Exception('Unable to find the configuration file. \
                             Did you set environment variables? \
@@ -218,7 +218,7 @@ function launching_scripts {
 
 function launching_crawler {
     if [[ ! $iscrawler ]]; then
-        CONFIG=$AIL_BIN/packages/config.cfg
+        CONFIG=$AIL_HOME/configs/core.cfg
        lport=$(awk '/^\[Crawler\]/{f=1} f==1&&/^splash_port/{print $3;exit}' "${CONFIG}")
 
        IFS='-' read -ra PORTS <<< "$lport"
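The `launching_crawler` hunk above keeps the same `awk` extraction and only repoints it at the new config path: the script prints field 3 of the first `splash_port` line, but only after the `[Crawler]` section header has been seen. Assuming a config shaped like the one in this diff (the values below are made up for the demo), the extraction can be checked in isolation:

```shell
#!/bin/sh
# Build a throwaway config that mimics the [Crawler] section layout.
CONFIG=$(mktemp)
cat > "$CONFIG" <<'EOF'
[Crawler]
activate_crawler = True
splash_port = 8050-8052
EOF

# Same extraction as LAUNCH.sh: set flag f on the section header, then
# print field 3 ("key = value" makes the value field 3) and stop.
lport=$(awk '/^\[Crawler\]/{f=1} f==1&&/^splash_port/{print $3;exit}' "${CONFIG}")
echo "splash_port=$lport"

# LAUNCH.sh then splits the port range on '-' (it uses a bash array;
# a POSIX-friendly equivalent with two variables is shown here).
IFS='-' read -r first last <<EOF2
$lport
EOF2
echo "first=$first last=$last"
rm -f "$CONFIG"
```

This mirrors why the hunk only needed to change the `CONFIG=` line: the parsing logic is path-independent.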
@@ -8,20 +8,20 @@ module
 This module send tagged pastes to MISP or THE HIVE Project
 
 """
 
-import redis
-import sys
 import os
+import sys
+import uuid
+import redis
 import time
 import json
-import configparser
 
 from pubsublogger import publisher
 from Helper import Process
 from packages import Paste
 import ailleakObject
 
-import uuid
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
 
 from pymisp import PyMISP
 
@@ -133,26 +133,10 @@ if __name__ == "__main__":
 
     config_section = 'MISP_The_hive_feeder'
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
+    config_loader = ConfigLoader.ConfigLoader()
 
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-
-    r_serv_db = redis.StrictRedis(
-        host=cfg.get("ARDB_DB", "host"),
-        port=cfg.getint("ARDB_DB", "port"),
-        db=cfg.getint("ARDB_DB", "db"),
-        decode_responses=True)
-
-    r_serv_metadata = redis.StrictRedis(
-        host=cfg.get("ARDB_Metadata", "host"),
-        port=cfg.getint("ARDB_Metadata", "port"),
-        db=cfg.getint("ARDB_Metadata", "db"),
-        decode_responses=True)
+    r_serv_db = config_loader.get_redis_conn("ARDB_DB")
+    r_serv_metadata = config_loader.get_redis_conn("ARDB_Metadata")
 
     # set sensor uuid
     uuid_ail = r_serv_db.get('ail:uuid')
@@ -212,7 +196,9 @@ if __name__ == "__main__":
     refresh_time = 3
     ## FIXME: remove it
-    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "pastes"))
+    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes"))
+
+    config_loader = None
 
     time_1 = time.time()
 
     while True:
39  bin/Mixer.py
@@ -29,16 +29,20 @@ Every data coming from a named feed can be sent to a pre-processing module befor
 The mapping can be done via the variable FEED_QUEUE_MAPPING
 
 """
+import os
+import sys
 
 import base64
 import hashlib
-import os
 import time
 from pubsublogger import publisher
 import redis
-import configparser
 
 from Helper import Process
 
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
+
 
 # CONFIG #
 refresh_time = 30
@@ -52,37 +56,22 @@ if __name__ == '__main__':
 
     p = Process(config_section)
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
+    config_loader = ConfigLoader.ConfigLoader()
 
     # REDIS #
-    server = redis.StrictRedis(
-        host=cfg.get("Redis_Mixer_Cache", "host"),
-        port=cfg.getint("Redis_Mixer_Cache", "port"),
-        db=cfg.getint("Redis_Mixer_Cache", "db"),
-        decode_responses=True)
-
-    server_cache = redis.StrictRedis(
-        host=cfg.get("Redis_Log_submit", "host"),
-        port=cfg.getint("Redis_Log_submit", "port"),
-        db=cfg.getint("Redis_Log_submit", "db"),
-        decode_responses=True)
+    server = config_loader.get_redis_conn("Redis_Mixer_Cache")
+    server_cache = config_loader.get_redis_conn("Redis_Log_submit")
 
     # LOGGING #
     publisher.info("Feed Script started to receive & publish.")
 
     # OTHER CONFIG #
-    operation_mode = cfg.getint("Module_Mixer", "operation_mode")
-    ttl_key = cfg.getint("Module_Mixer", "ttl_duplicate")
-    default_unnamed_feed_name = cfg.get("Module_Mixer", "default_unnamed_feed_name")
+    operation_mode = config_loader.get_config_int("Module_Mixer", "operation_mode")
+    ttl_key = config_loader.get_config_int("Module_Mixer", "ttl_duplicate")
+    default_unnamed_feed_name = config_loader.get_config_str("Module_Mixer", "default_unnamed_feed_name")
 
-    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], p.config.get("Directories", "pastes")) + '/'
+    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes")) + '/'
+    config_loader = None
 
     # STATS #
     processed_paste = 0
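Several modules in this commit gain the same two-line preamble (`sys.path.append(...)` then `import ConfigLoader`) to make the shared loader under `lib/` importable without packaging changes. The mechanism is plain `sys.path` manipulation; a self-contained demonstration (the directory and module names below are fabricated for the demo, not taken from AIL):

```python
import os
import sys
import tempfile

# Create a stand-in "lib/" directory containing a module, then make it
# importable the same way the AIL modules do with sys.path.append.
lib_dir = tempfile.mkdtemp()
with open(os.path.join(lib_dir, 'demo_loader.py'), 'w') as f:
    f.write("ANSWER = 42\n")

sys.path.append(lib_dir)
import demo_loader  # found via the appended path, like ConfigLoader

print(demo_loader.ANSWER)
```

The trade-off of this approach is that the import depends on the `AIL_BIN` environment variable being set at runtime, which is why the modules keep raising an explicit error when the environment is not activated.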
@@ -1,354 +0,0 @@
-#!/usr/bin/env python3
-# -*-coding:UTF-8 -*
-
-'''
-
-This module can be use to see information of running modules.
-These information are logged in "logs/moduleInfo.log"
-
-It can also try to manage them by killing inactive one.
-However, it does not support mutliple occurence of the same module
-(It will kill the first one obtained by get)
-
-
-'''
-
-import time
-import datetime
-import redis
-import os
-import signal
-import argparse
-from subprocess import PIPE, Popen
-import configparser
-import json
-from terminaltables import AsciiTable
-import textwrap
-from colorama import Fore, Back, Style, init
-import curses
-
-# CONFIG VARIABLES
-kill_retry_threshold = 60 #1m
-log_filename = "../logs/moduleInfo.log"
-command_search_pid = "ps a -o pid,cmd | grep {}"
-command_search_name = "ps a -o pid,cmd | grep {}"
-command_restart_module = "screen -S \"Script\" -X screen -t \"{}\" bash -c \"./{}.py; read x\""
-
-init() #Necesary for colorama
-printarrayGlob = [None]*14
-printarrayGlob.insert(0, ["Time", "Module", "PID", "Action"])
-lastTimeKillCommand = {}
-
-#Curses init
-#stdscr = curses.initscr()
-#curses.cbreak()
-#stdscr.keypad(1)
-
-# GLOBAL
-last_refresh = 0
-
-
-def getPid(module):
-    p = Popen([command_search_pid.format(module+".py")], stdin=PIPE, stdout=PIPE, bufsize=1, shell=True)
-    for line in p.stdout:
-        print(line)
-        splittedLine = line.split()
-        if 'python2' in splittedLine:
-            return int(splittedLine[0])
-    return None
-
-def clearRedisModuleInfo():
-    for k in server.keys("MODULE_*"):
-        server.delete(k)
-    inst_time = datetime.datetime.fromtimestamp(int(time.time()))
-    printarrayGlob.insert(1, [inst_time, "*", "-", "Cleared redis module info"])
-    printarrayGlob.pop()
-
-def cleanRedis():
-    for k in server.keys("MODULE_TYPE_*"):
-        moduleName = k[12:].split('_')[0]
-        for pid in server.smembers(k):
-            flag_pid_valid = False
-            proc = Popen([command_search_name.format(pid)], stdin=PIPE, stdout=PIPE, bufsize=1, shell=True)
-            for line in proc.stdout:
-                splittedLine = line.split()
-                if ('python2' in splittedLine or 'python' in splittedLine) and "./"+moduleName+".py" in splittedLine:
-                    flag_pid_valid = True
-
-            if not flag_pid_valid:
-                print(flag_pid_valid, 'cleaning', pid, 'in', k)
-                server.srem(k, pid)
-                inst_time = datetime.datetime.fromtimestamp(int(time.time()))
-                printarrayGlob.insert(1, [inst_time, moduleName, pid, "Cleared invalid pid in " + k])
-                printarrayGlob.pop()
-    #time.sleep(5)
-
-
-def kill_module(module, pid):
-    print('')
-    print('-> trying to kill module:', module)
-
-    if pid is None:
-        print('pid was None')
-        printarrayGlob.insert(1, [0, module, pid, "PID was None"])
-        printarrayGlob.pop()
-        pid = getPid(module)
-    else: #Verify that the pid is at least in redis
-        if server.exists("MODULE_"+module+"_"+str(pid)) == 0:
-            return
-
-    lastTimeKillCommand[pid] = int(time.time())
-    if pid is not None:
-        try:
-            os.kill(pid, signal.SIGUSR1)
-        except OSError:
-            print(pid, 'already killed')
-            inst_time = datetime.datetime.fromtimestamp(int(time.time()))
-            printarrayGlob.insert(1, [inst_time, module, pid, "Already killed"])
-            printarrayGlob.pop()
-            return
-        time.sleep(1)
-        if getPid(module) is None:
-            print(module, 'has been killed')
-            print('restarting', module, '...')
-            p2 = Popen([command_restart_module.format(module, module)], stdin=PIPE, stdout=PIPE, bufsize=1, shell=True)
-            inst_time = datetime.datetime.fromtimestamp(int(time.time()))
-            printarrayGlob.insert(1, [inst_time, module, pid, "Killed"])
-            printarrayGlob.insert(1, [inst_time, module, "?", "Restarted"])
-            printarrayGlob.pop()
-            printarrayGlob.pop()
-
-        else:
-            print('killing failed, retrying...')
-            inst_time = datetime.datetime.fromtimestamp(int(time.time()))
-            printarrayGlob.insert(1, [inst_time, module, pid, "Killing #1 failed."])
-            printarrayGlob.pop()
-
-            time.sleep(1)
-            os.kill(pid, signal.SIGUSR1)
-            time.sleep(1)
-            if getPid(module) is None:
-                print(module, 'has been killed')
-                print('restarting', module, '...')
-                p2 = Popen([command_restart_module.format(module, module)], stdin=PIPE, stdout=PIPE, bufsize=1, shell=True)
-                inst_time = datetime.datetime.fromtimestamp(int(time.time()))
-                printarrayGlob.insert(1, [inst_time, module, pid, "Killed"])
-                printarrayGlob.insert(1, [inst_time, module, "?", "Restarted"])
-                printarrayGlob.pop()
-                printarrayGlob.pop()
-            else:
-                print('killing failed!')
-                inst_time = datetime.datetime.fromtimestamp(int(time.time()))
-                printarrayGlob.insert(1, [inst_time, module, pid, "Killing failed!"])
-                printarrayGlob.pop()
-    else:
-        print('Module does not exist')
-        inst_time = datetime.datetime.fromtimestamp(int(time.time()))
-        printarrayGlob.insert(1, [inst_time, module, pid, "Killing failed, module not found"])
-        printarrayGlob.pop()
-    #time.sleep(5)
-    cleanRedis()
-
-def get_color(time, idle):
-    if time is not None:
-        temp = time.split(':')
-        time = int(temp[0])*3600 + int(temp[1])*60 + int(temp[2])
-
-        if time >= args.treshold:
-            if not idle:
-                return Back.RED + Style.BRIGHT
-            else:
-                return Back.MAGENTA + Style.BRIGHT
-        elif time > args.treshold/2:
-            return Back.YELLOW + Style.BRIGHT
-        else:
-            return Back.GREEN + Style.BRIGHT
-    else:
-        return Style.RESET_ALL
-
-def waiting_refresh():
-    global last_refresh
-    if time.time() - last_refresh < args.refresh:
-        return False
-    else:
-        last_refresh = time.time()
-        return True
-
-
-
-if __name__ == "__main__":
-
-    parser = argparse.ArgumentParser(description='Show info concerning running modules and log suspected stucked modules. May be use to automatically kill and restart stucked one.')
-    parser.add_argument('-r', '--refresh', type=int, required=False, default=1, help='Refresh rate')
-    parser.add_argument('-t', '--treshold', type=int, required=False, default=60*10*1, help='Refresh rate')
-    parser.add_argument('-k', '--autokill', type=int, required=False, default=0, help='Enable auto kill option (1 for TRUE, anything else for FALSE)')
-    parser.add_argument('-c', '--clear', type=int, required=False, default=0, help='Clear the current module information (Used to clear data from old launched modules)')
-
-    args = parser.parse_args()
-
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-
-    # REDIS #
-    server = redis.StrictRedis(
-        host=cfg.get("Redis_Queues", "host"),
-        port=cfg.getint("Redis_Queues", "port"),
-        db=cfg.getint("Redis_Queues", "db"),
-        decode_responses=True)
-
-    if args.clear == 1:
-        clearRedisModuleInfo()
-
-    lastTime = datetime.datetime.now()
-
-    module_file_array = set()
-    no_info_modules = {}
-    path_allmod = os.path.join(os.environ['AIL_HOME'], 'doc/all_modules.txt')
-    with open(path_allmod, 'r') as module_file:
-        for line in module_file:
-            module_file_array.add(line[:-1])
-
-    cleanRedis()
-
-    while True:
-        if waiting_refresh():
-
-            #key = ''
-            #while key != 'q':
-            #    key = stdsrc.getch()
-            #    stdscr.refresh()
-
-            all_queue = set()
-            printarray1 = []
-            printarray2 = []
-            printarray3 = []
-            for queue, card in server.hgetall("queues").items():
-                all_queue.add(queue)
-                key = "MODULE_" + queue + "_"
-                keySet = "MODULE_TYPE_" + queue
-                array_module_type = []
-
-                for moduleNum in server.smembers(keySet):
-                    value = server.get(key + str(moduleNum))
-                    if value is not None:
-                        timestamp, path = value.split(", ")
-                        if timestamp is not None and path is not None:
-                            startTime_readable = datetime.datetime.fromtimestamp(int(timestamp))
-                            processed_time_readable = str((datetime.datetime.now() - startTime_readable)).split('.')[0]
-
-                            if int(card) > 0:
-                                if int((datetime.datetime.now() - startTime_readable).total_seconds()) > args.treshold:
-                                    log = open(log_filename, 'a')
-                                    log.write(json.dumps([queue, card, str(startTime_readable), str(processed_time_readable), path]) + "\n")
-                                    try:
-                                        last_kill_try = time.time() - lastTimeKillCommand[moduleNum]
-                                    except KeyError:
-                                        last_kill_try = kill_retry_threshold+1
-                                    if args.autokill == 1 and last_kill_try > kill_retry_threshold :
-                                        kill_module(queue, int(moduleNum))
-
-                                array_module_type.append([get_color(processed_time_readable, False) + str(queue), str(moduleNum), str(card), str(startTime_readable), str(processed_time_readable), str(path) + get_color(None, False)])
-
-                            else:
-                                printarray2.append([get_color(processed_time_readable, True) + str(queue), str(moduleNum), str(card), str(startTime_readable), str(processed_time_readable), str(path) + get_color(None, True)])
-                array_module_type.sort(lambda x,y: cmp(x[4], y[4]), reverse=True)
-                for e in array_module_type:
-                    printarray1.append(e)
-
-            for curr_queue in module_file_array:
-                if curr_queue not in all_queue:
-                    printarray3.append([curr_queue, "Not running"])
-                else:
-                    if len(list(server.smembers('MODULE_TYPE_'+curr_queue))) == 0:
-                        if curr_queue not in no_info_modules:
-                            no_info_modules[curr_queue] = int(time.time())
-                            printarray3.append([curr_queue, "No data"])
-                        else:
-                            #If no info since long time, try to kill
-                            if args.autokill == 1:
-                                if int(time.time()) - no_info_modules[curr_queue] > args.treshold:
-                                    kill_module(curr_queue, None)
-                                    no_info_modules[curr_queue] = int(time.time())
-                                printarray3.append([curr_queue, "Stuck or idle, restarting in " + str(abs(args.treshold - (int(time.time()) - no_info_modules[curr_queue]))) + "s"])
-                            else:
-                                printarray3.append([curr_queue, "Stuck or idle, restarting disabled"])
-
-            ## FIXME To add:
-            ## Button KILL Process using Curses
-
-            printarray1.sort(key=lambda x: x[0][9:], reverse=False)
-            printarray2.sort(key=lambda x: x[0][9:], reverse=False)
-            printarray1.insert(0,["Queue", "PID", "Amount", "Paste start time", "Processing time for current paste (H:M:S)", "Paste hash"])
-            printarray2.insert(0,["Queue", "PID","Amount", "Paste start time", "Time since idle (H:M:S)", "Last paste hash"])
-            printarray3.insert(0,["Queue", "State"])
-
-            os.system('clear')
-            t1 = AsciiTable(printarray1, title="Working queues")
-            t1.column_max_width(1)
-            if not t1.ok:
-                longest_col = t1.column_widths.index(max(t1.column_widths))
-                max_length_col = t1.column_max_width(longest_col)
-                if max_length_col > 0:
-                    for i, content in enumerate(t1.table_data):
-                        if len(content[longest_col]) > max_length_col:
-                            temp = ''
-                            for l in content[longest_col].splitlines():
-                                if len(l) > max_length_col:
-                                    temp += '\n'.join(textwrap.wrap(l, max_length_col)) + '\n'
-                                else:
-                                    temp += l + '\n'
-                            content[longest_col] = temp.strip()
-                            t1.table_data[i] = content
-
-            t2 = AsciiTable(printarray2, title="Idling queues")
-            t2.column_max_width(1)
-            if not t2.ok:
-                longest_col = t2.column_widths.index(max(t2.column_widths))
-                max_length_col = t2.column_max_width(longest_col)
-                if max_length_col > 0:
-                    for i, content in enumerate(t2.table_data):
-                        if len(content[longest_col]) > max_length_col:
-                            temp = ''
-                            for l in content[longest_col].splitlines():
-                                if len(l) > max_length_col:
-                                    temp += '\n'.join(textwrap.wrap(l, max_length_col)) + '\n'
-                                else:
-                                    temp += l + '\n'
-                            content[longest_col] = temp.strip()
-                            t2.table_data[i] = content
-
-            t3 = AsciiTable(printarray3, title="Not running queues")
-            t3.column_max_width(1)
-
-            printarray4 = []
-            for elem in printarrayGlob:
-                if elem is not None:
-                    printarray4.append(elem)
-
-            t4 = AsciiTable(printarray4, title="Last actions")
-            t4.column_max_width(1)
-
-            legend_array = [["Color", "Meaning"], [Back.RED+Style.BRIGHT+" "*10+Style.RESET_ALL, "Time >=" +str(args.treshold)+Style.RESET_ALL], [Back.MAGENTA+Style.BRIGHT+" "*10+Style.RESET_ALL, "Time >=" +str(args.treshold)+" while idle"+Style.RESET_ALL], [Back.YELLOW+Style.BRIGHT+" "*10+Style.RESET_ALL, "Time >=" +str(args.treshold/2)+Style.RESET_ALL], [Back.GREEN+Style.BRIGHT+" "*10+Style.RESET_ALL, "Time <" +str(args.treshold)]]
-            legend = AsciiTable(legend_array, title="Legend")
-            legend.column_max_width(1)
-
-            print(legend.table)
-            print('\n')
-            print(t1.table)
-            print('\n')
-            print(t2.table)
-            print('\n')
-            print(t3.table)
-            print('\n')
-            print(t4.table9)
-
-            if (datetime.datetime.now() - lastTime).total_seconds() > args.refresh*5:
-                lastTime = datetime.datetime.now()
-                cleanRedis()
-            #time.sleep(args.refresh)
@@ -10,13 +10,16 @@ from asciimatics.event import Event
 from asciimatics.event import KeyboardEvent, MouseEvent
 import sys, os
 import time, datetime
-import argparse, configparser
+import argparse
 import json
 import redis
 import psutil
 from subprocess import PIPE, Popen
 from packages import Paste
 
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
+
 # CONFIG VARIABLES
 kill_retry_threshold = 60 #1m
 log_filename = "../logs/moduleInfo.log"
@@ -798,21 +801,11 @@ if __name__ == "__main__":
 
     args = parser.parse_args()
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
+    config_loader = ConfigLoader.ConfigLoader()
 
     # REDIS #
-    server = redis.StrictRedis(
-        host=cfg.get("Redis_Queues", "host"),
-        port=cfg.getint("Redis_Queues", "port"),
-        db=cfg.getint("Redis_Queues", "db"),
-        decode_responses=True)
+    server = config_loader.get_redis_conn("Redis_Queues")
+    config_loader = None
 
     if args.clear == 1:
         clearRedisModuleInfo()
@@ -1,39 +1,34 @@
 #!/usr/bin/env python3
 # -*-coding:UTF-8 -*
 
-import argparse
-import configparser
-import traceback
 import os
+import sys
 
+import argparse
+import traceback
 import smtplib
 from pubsublogger import publisher
 from email.mime.multipart import MIMEMultipart
 from email.mime.text import MIMEText
 
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
+
 """
 This module allows the global configuration and management of notification settings and methods.
 """
 
-# CONFIG #
-configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
+config_loader = ConfigLoader.ConfigLoader()
 
 publisher.port = 6380
 publisher.channel = "Script"
 
 def sendEmailNotification(recipient, alert_name, content):
 
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv?')
-
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-
-    sender = cfg.get("Notifications", "sender")
-    sender_host = cfg.get("Notifications", "sender_host")
-    sender_port = cfg.getint("Notifications", "sender_port")
-    sender_pw = cfg.get("Notifications", "sender_pw")
+    sender = config_loader.get_config_str("Notifications", "sender")
+    sender_host = config_loader.get_config_str("Notifications", "sender_host")
+    sender_port = config_loader.get_config_int("Notifications", "sender_port")
+    sender_pw = config_loader.get_config_str("Notifications", "sender_pw")
     if sender_pw == 'None':
         sender_pw = None
@@ -1,57 +0,0 @@
-#!/usr/bin/env python3
-# -*-coding:UTF-8 -*
-
-import redis
-import argparse
-import configparser
-import time
-import os
-from pubsublogger import publisher
-import texttable
-
-
-def main():
-    """Main Function"""
-
-    # CONFIG #
-    cfg = configparser.ConfigParser()
-    cfg.read('./packages/config.cfg')
-
-    # SCRIPT PARSER #
-    parser = argparse.ArgumentParser(
-        description='''This script is a part of the Assisted Information Leak framework.''',
-        epilog='''''')
-
-    parser.add_argument('-db', type=int, default=0,
-                        help='The name of the Redis DB (default 0)',
-                        choices=[0, 1, 2, 3, 4], action='store')
-
-    # REDIS #
-    r_serv = redis.StrictRedis(
-        host=cfg.get("Redis_Queues", "host"),
-        port=cfg.getint("Redis_Queues", "port"),
-        db=cfg.getint("Redis_Queues", "db"),
-        decode_responses=True)
-
-    # LOGGING #
-    publisher.port = 6380
-    publisher.channel = "Queuing"
-
-    while True:
-        table = texttable.Texttable()
-        table.header(["Queue name", "#Items"])
-        row = []
-        for queue in r_serv.smembers("queues"):
-            current = r_serv.llen(queue)
-            current = current - r_serv.llen(queue)
-            row.append((queue, r_serv.llen(queue)))
-
-        time.sleep(0.5)
-        row.sort()
-        table.add_rows(row, header=False)
-        os.system('clear')
-        print(table.draw())
-
-
-if __name__ == "__main__":
-    main()
@@ -1,97 +0,0 @@
-#!/usr/bin/python3
-# -*-coding:UTF-8 -*
-
-import redis
-import argparse
-import configparser
-from datetime import datetime
-from pubsublogger import publisher
-
-import matplotlib.pyplot as plt
-
-
-def main():
-    """Main Function"""
-
-    # CONFIG #
-    cfg = configparser.ConfigParser()
-    cfg.read('./packages/config.cfg')
-
-    # SCRIPT PARSER #
-    parser = argparse.ArgumentParser(
-        description='''This script is a part of the Analysis Information Leak framework.''',
-        epilog='''''')
-
-    parser.add_argument('-f', type=str, metavar="filename", default="figure",
-                        help='The absolute path name of the "figure.png"',
-                        action='store')
-    parser.add_argument('-y', '--year', type=int, required=False, default=None, help='The date related to the DB')
-
-    args = parser.parse_args()
-
-    # REDIS #
-    # port generated automatically depending on the date
-    curYear = datetime.now().year if args.year is None else args.year
-    r_serv = redis.StrictRedis(
-        host=cfg.get("ARDB_Hashs", "host"),
-        port=cfg.getint("ARDB_Hashs", "port"),
-        db=curYear,
-        decode_responses=True)
-
-    # LOGGING #
-    publisher.port = 6380
-    publisher.channel = "Graph"
-
-    # FUNCTIONS #
-    publisher.info("""Creating the Repartition Graph""")
-
-    total_list = []
-    codepad_list = []
-    pastie_list = []
-    pastebin_list = []
-    for hash in r_serv.keys():
-        total_list.append(r_serv.scard(hash))
-
-        code = 0
-        pastie = 0
-        pastebin = 0
-        for paste in r_serv.smembers(hash):
-            source = paste.split("/")[5]
-
-            if source == "codepad.org":
-                code = code + 1
-            elif source == "pastie.org":
-                pastie = pastie + 1
-            elif source == "pastebin.com":
-                pastebin = pastebin + 1
-
-        codepad_list.append(code)
-        pastie_list.append(pastie)
-        pastebin_list.append(pastebin)
-
-    codepad_list.sort(reverse=True)
-    pastie_list.sort(reverse=True)
-    pastebin_list.sort(reverse=True)
-
-    total_list.sort(reverse=True)
-
-    plt.plot(codepad_list, 'b', label='Codepad.org')
-    plt.plot(pastebin_list, 'g', label='Pastebin.org')
-    plt.plot(pastie_list, 'y', label='Pastie.org')
-    plt.plot(total_list, 'r', label='Total')
-
-    plt.xscale('log')
-    plt.xlabel('Hashs')
-    plt.ylabel('Occur[Hash]')
-    plt.title('Repartition')
-    plt.legend()
-    plt.grid()
-    plt.tight_layout()
-
-    plt.savefig(args.f+".png", dpi=None, facecolor='w', edgecolor='b',
-                orientation='portrait', papertype=None, format="png",
-                transparent=False, bbox_inches=None, pad_inches=0.1,
-                frameon=True)
-
-if __name__ == "__main__":
-    main()
@@ -14,6 +14,8 @@
 Hutto, C.J. & Gilbert, E.E. (2014). VADER: A Parsimonious Rule-based Model for Sentiment Analysis of Social Media Text. Eighth International Conference on Weblogs and Social Media (ICWSM-14). Ann Arbor, MI, June 2014.

 """
+import os
+import sys

 import time
 import datetime
@@ -24,6 +26,9 @@ from pubsublogger import publisher
 from Helper import Process
 from packages import Paste

+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
+
 from nltk.sentiment.vader import SentimentIntensityAnalyzer
 from nltk import tokenize

@@ -32,19 +37,6 @@ accepted_Mime_type = ['text/plain']
 size_threshold = 250
 line_max_length_threshold = 1000

-import os
-import configparser
-
-configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-if not os.path.exists(configfile):
-    raise Exception('Unable to find the configuration file. \
-    Did you set environment variables? \
-    Or activate the virtualenv.')
-
-cfg = configparser.ConfigParser()
-cfg.read(configfile)
-
-sentiment_lexicon_file = cfg.get("Directories", "sentiment_lexicon_file")
 #time_clean_sentiment_db = 60*60

 def Analyse(message, server):
@@ -151,12 +143,12 @@ if __name__ == '__main__':
     # Sent to the logging a description of the module
     publisher.info("<description of the module>")

+    config_loader = ConfigLoader.ConfigLoader()
+    sentiment_lexicon_file = config_loader.get_config_str("Directories", "sentiment_lexicon_file")
+
     # REDIS_LEVEL_DB #
-    server = redis.StrictRedis(
-        host=p.config.get("ARDB_Sentiment", "host"),
-        port=p.config.get("ARDB_Sentiment", "port"),
-        db=p.config.get("ARDB_Sentiment", "db"),
-        decode_responses=True)
+    server = config_loader.get_redis_conn("ARDB_Sentiment")
+    config_loader = None

     time1 = time.time()
@@ -1,68 +0,0 @@
-#!/usr/bin/env python3
-# -*-coding:UTF-8 -*
-"""
-The ZMQ_Feed_Q Module
-=====================
-
-This module is consuming the Redis-list created by the ZMQ_Feed_Q Module,
-And save the paste on disk to allow others modules to work on them.
-
-..todo:: Be able to choose to delete or not the saved paste after processing.
-..todo:: Store the empty paste (unprocessed) somewhere in Redis.
-
-..note:: Module ZMQ_Something_Q and ZMQ_Something are closely bound, always put
-the same Subscriber name in both of them.
-
-Requirements
-------------
-
-*Need running Redis instances.
-*Need the ZMQ_Feed_Q Module running to be able to work properly.
-
-"""
-import redis
-import configparser
-import os
-
-configfile = os.path.join(os.environ['AIL_BIN'], './packages/config.cfg')
-
-
-def main():
-    """Main Function"""
-
-    # CONFIG #
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-
-    # REDIS
-    r_serv = redis.StrictRedis(host=cfg.get("Redis_Queues", "host"),
-                               port=cfg.getint("Redis_Queues", "port"),
-                               db=cfg.getint("Redis_Queues", "db"),
-                               decode_responses=True)
-
-    # FIXME: automatic based on the queue name.
-    # ### SCRIPTS ####
-    r_serv.sadd("SHUTDOWN_FLAGS", "Feed")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Categ")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Lines")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Tokenize")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Attributes")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Creditcards")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Duplicate")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Mails")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Onion")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Urls")
-
-    r_serv.sadd("SHUTDOWN_FLAGS", "Feed_Q")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Categ_Q")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Lines_Q")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Tokenize_Q")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Attributes_Q")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Creditcards_Q")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Duplicate_Q")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Mails_Q")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Onion_Q")
-    r_serv.sadd("SHUTDOWN_FLAGS", "Urls_Q")
-
-if __name__ == "__main__":
-    main()
@@ -18,9 +18,6 @@ import NotificationHelper
 from packages import Item
 from packages import Term

-sys.path.append(os.path.join(os.environ['AIL_FLASK'], 'modules'))
-import Flask_config
-
 full_item_url = "/showsavedpaste/?paste="

 mail_body_template = "AIL Framework,\nNew occurrence for term tracked term: {}\nitem id: {}\nurl: {}{}"
@@ -68,9 +68,9 @@ def main():

     #------------------------------------------------------------------------------------#

-    config_file_default = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    config_file_default_sample = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg.sample')
-    config_file_default_backup = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg.backup')
+    config_file_default = os.path.join(os.environ['AIL_HOME'], 'configs/core.cfg')
+    config_file_default_sample = os.path.join(os.environ['AIL_HOME'], 'configs/core.cfg.sample')
+    config_file_default_backup = os.path.join(os.environ['AIL_HOME'], 'configs/core.cfg.backup')

     config_file_update = os.path.join(os.environ['AIL_HOME'], 'configs/update.cfg')
     config_file_update_sample = os.path.join(os.environ['AIL_HOME'], 'configs/update.cfg.sample')
@@ -1,13 +1,18 @@
 #!/usr/bin/env python3
 # -*-coding:UTF-8 -*

+import os
+import sys
+
 from pymisp.tools.abstractgenerator import AbstractMISPObjectGenerator
-import configparser
 from packages import Paste
 import datetime
 import json
 from io import BytesIO

+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
+
 class AilLeakObject(AbstractMISPObjectGenerator):
     def __init__(self, uuid_ail, p_source, p_date, p_content, p_duplicate, p_duplicate_number):
         super(AbstractMISPObjectGenerator, self).__init__('ail-leak')
@@ -35,9 +40,9 @@ class ObjectWrapper:
         self.pymisp = pymisp
         self.currentID_date = None
         self.eventID_to_push = self.get_daily_event_id()
-        cfg = configparser.ConfigParser()
-        cfg.read('./packages/config.cfg')
-        self.maxDuplicateToPushToMISP = cfg.getint("ailleakObject", "maxDuplicateToPushToMISP")
+        config_loader = ConfigLoader.ConfigLoader()
+        self.maxDuplicateToPushToMISP = config_loader.get_config_int("ailleakObject", "maxDuplicateToPushToMISP")
+        config_loader = None
         self.attribute_to_tag = None

     def add_new_object(self, uuid_ail, path, p_source, tag):
@@ -17,36 +17,33 @@
 #
 # Copyright (c) 2014 Alexandre Dulaunoy - a@foo.be

+import os
+import sys
+
 import zmq
 import random
-import sys
 import time
 import redis
 import base64
-import os
-import configparser

-configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-if not os.path.exists(configfile):
-    raise Exception('Unable to find the configuration file. \
-    Did you set environment variables? \
-    Or activate the virtualenv.')
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader

-cfg = configparser.ConfigParser()
-cfg.read(configfile)
+config_loader = ConfigLoader.ConfigLoader()

-if cfg.has_option("ZMQ_Global", "bind"):
-    zmq_url = cfg.get("ZMQ_Global", "bind")
+if config_loader.has_option("ZMQ_Global", "bind"):
+    zmq_url = config_loader.get_config_str("ZMQ_Global", "bind")
 else:
     zmq_url = "tcp://127.0.0.1:5556"

-pystemonpath = cfg.get("Directories", "pystemonpath")
-pastes_directory = cfg.get("Directories", "pastes")
+pystemonpath = config_loader.get_config_str("Directories", "pystemonpath")
+pastes_directory = config_loader.get_config_str("Directories", "pastes")
 pastes_directory = os.path.join(os.environ['AIL_HOME'], pastes_directory)
 base_sleeptime = 0.01
 sleep_inc = 0

+config_loader = None
+
 context = zmq.Context()
 socket = context.socket(zmq.PUB)
 socket.bind(zmq_url)
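The bind-address fallback introduced above is easy to exercise in isolation: if the `[ZMQ_Global]` section defines `bind`, that endpoint is used, otherwise the feeder falls back to the default publisher address. A minimal sketch of that logic using plain `configparser` (the real module goes through the project's `ConfigLoader`; the function name here is illustrative):

```python
import configparser

DEFAULT_ZMQ_URL = "tcp://127.0.0.1:5556"

def resolve_zmq_url(cfg: configparser.ConfigParser) -> str:
    # Prefer an explicit bind address from the config;
    # otherwise fall back to the default publisher endpoint.
    if cfg.has_option("ZMQ_Global", "bind"):
        return cfg.get("ZMQ_Global", "bind")
    return DEFAULT_ZMQ_URL

cfg = configparser.ConfigParser()
cfg.read_string("[ZMQ_Global]\nbind = tcp://0.0.0.0:5556\n")
print(resolve_zmq_url(cfg))                           # tcp://0.0.0.0:5556
print(resolve_zmq_url(configparser.ConfigParser()))   # tcp://127.0.0.1:5556
```

`has_option` returns `False` both when the key and when the whole section is missing, which is what makes the single check sufficient.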
@@ -10,11 +10,13 @@
 #
 # Copyright (c) 2014 Alexandre Dulaunoy - a@foo.be

-import configparser
 import argparse
 import gzip
 import os
+import sys
+
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader

 def readdoc(path=None):
     if path is None:
@@ -22,13 +24,11 @@ def readdoc(path=None):
     f = gzip.open(path, 'r')
     return f.read()

-configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-cfg = configparser.ConfigParser()
-cfg.read(configfile)
+config_loader = ConfigLoader.ConfigLoader()

 # Indexer configuration - index dir and schema setup
-indexpath = os.path.join(os.environ['AIL_HOME'], cfg.get("Indexer", "path"))
-indexertype = cfg.get("Indexer", "type")
+indexpath = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Indexer", "path"))
+indexertype = config_loader.get_config_str("Indexer", "type")

 argParser = argparse.ArgumentParser(description='Fulltext search for AIL')
 argParser.add_argument('-q', action='append', help='query to lookup (one or more)')
@@ -46,3 +46,9 @@ class ConfigLoader(object):

     def get_config_boolean(self, section, key_name):
         return self.cfg.getboolean(section, key_name)
+
+    def has_option(self, section, key_name):
+        return self.cfg.has_option(section, key_name)
+
+    def has_section(self, section):
+        return self.cfg.has_section(section)
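The `has_option`/`has_section` helpers added above are thin wrappers around `configparser`, like the existing getters. A minimal self-contained sketch of the pattern (a hypothetical stand-in, not the project's actual `ConfigLoader`, which additionally locates `configs/core.cfg` via environment variables and builds Redis connections with `get_redis_conn`):

```python
import configparser

class MiniConfigLoader:
    """Illustrative stand-in for AIL's ConfigLoader (simplified)."""

    def __init__(self, cfg_text: str):
        # The real loader reads configs/core.cfg from disk;
        # here we parse an in-memory string for the sake of the example.
        self.cfg = configparser.ConfigParser()
        self.cfg.read_string(cfg_text)

    def get_config_str(self, section, key_name):
        return self.cfg.get(section, key_name)

    def get_config_int(self, section, key_name):
        return self.cfg.getint(section, key_name)

    def has_option(self, section, key_name):
        return self.cfg.has_option(section, key_name)

    def has_section(self, section):
        return self.cfg.has_section(section)

loader = MiniConfigLoader("[ZMQ_Global]\nbind = tcp://127.0.0.1:5556\nport = 5556\n")
print(loader.has_option("ZMQ_Global", "bind"))     # True
print(loader.get_config_str("ZMQ_Global", "bind"))
```

Centralizing access this way is what lets the rest of this commit delete the repeated "find the config file, parse it, open Redis" boilerplate from every module.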
@@ -5,7 +5,7 @@ import os
 import sys
 import redis

-sys.path.append(os.path.join(os.environ['AIL_FLASK'], 'lib/'))
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
 import ConfigLoader

 config_loader = ConfigLoader.ConfigLoader()
@@ -17,6 +17,7 @@ Conditions to fulfill to be able to use this class correctly:
 """

 import os
+import sys
 import time
 import gzip
 import redis
@@ -25,11 +26,12 @@ import random
 from io import BytesIO
 import zipfile

-import configparser
-import sys
 sys.path.append(os.path.join(os.environ['AIL_BIN'], 'packages/'))
 from Date import Date

+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
+
 class HiddenServices(object):
     """
     This class representing a hiddenServices as an object.
@@ -43,27 +45,11 @@ class HiddenServices(object):

     def __init__(self, domain, type, port=80):

-        configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-        if not os.path.exists(configfile):
-            raise Exception('Unable to find the configuration file. \
-            Did you set environment variables? \
-            Or activate the virtualenv.')
-
-        cfg = configparser.ConfigParser()
-        cfg.read(configfile)
-
-        self.r_serv_onion = redis.StrictRedis(
-            host=cfg.get("ARDB_Onion", "host"),
-            port=cfg.getint("ARDB_Onion", "port"),
-            db=cfg.getint("ARDB_Onion", "db"),
-            decode_responses=True)
-
-        self.r_serv_metadata = redis.StrictRedis(
-            host=cfg.get("ARDB_Metadata", "host"),
-            port=cfg.getint("ARDB_Metadata", "port"),
-            db=cfg.getint("ARDB_Metadata", "db"),
-            decode_responses=True)
-
-        self.PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "pastes")) + '/'
+        config_loader = ConfigLoader.ConfigLoader()
+        self.r_serv_onion = config_loader.get_redis_conn("ARDB_Onion")
+        self.r_serv_metadata = config_loader.get_redis_conn("ARDB_Metadata")
+
+        self.PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes")) + '/'

         self.domain = domain
         self.type = type
@@ -71,18 +57,20 @@ class HiddenServices(object):
         self.tags = {}

         if type == 'onion' or type == 'regular':
-            self.paste_directory = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "pastes"))
-            self.paste_crawled_directory = os.path.join(self.paste_directory, cfg.get("Directories", "crawled"))
-            self.paste_crawled_directory_name = cfg.get("Directories", "crawled")
-            self.screenshot_directory = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "crawled_screenshot"))
+            self.paste_directory = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes"))
+            self.paste_crawled_directory = os.path.join(self.paste_directory, config_loader.get_config_str("Directories", "crawled"))
+            self.paste_crawled_directory_name = config_loader.get_config_str("Directories", "crawled")
+            self.screenshot_directory = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "crawled_screenshot"))
             self.screenshot_directory_screenshot = os.path.join(self.screenshot_directory, 'screenshot')
         elif type == 'i2p':
-            self.paste_directory = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "crawled_screenshot"))
-            self.screenshot_directory = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "crawled_screenshot"))
+            self.paste_directory = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "crawled_screenshot"))
+            self.screenshot_directory = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "crawled_screenshot"))
         else:
             ## TODO: # FIXME: add error
             pass

+        config_loader = None
+
     #def remove_absolute_path_link(self, key, value):
     #    print(key)
     #    print(value)
@@ -2,13 +2,17 @@
 # -*-coding:UTF-8 -*

 import os
+import sys
 import uuid
 import redis

-import Flask_config
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader

-r_serv_db = Flask_config.r_serv_db
-r_serv_log_submit = Flask_config.r_serv_log_submit
+config_loader = ConfigLoader.ConfigLoader()
+r_serv_db = config_loader.get_redis_conn("ARDB_DB")
+r_serv_log_submit = config_loader.get_redis_conn("Redis_Log_submit")
+config_loader = None

 def is_valid_uuid_v4(UUID):
     UUID = UUID.replace('-', '')
@@ -6,15 +6,18 @@ import sys
 import gzip
 import redis

-sys.path.append(os.path.join(os.environ['AIL_FLASK'], 'modules'))
-import Flask_config
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader

 sys.path.append(os.path.join(os.environ['AIL_BIN'], 'packages/'))
 import Date
 import Tag

-PASTES_FOLDER = Flask_config.PASTES_FOLDER
-r_cache = Flask_config.r_cache
-r_serv_metadata = Flask_config.r_serv_metadata
+config_loader = ConfigLoader.ConfigLoader()
+PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes")) + '/'
+r_cache = config_loader.get_redis_conn("Redis_Cache")
+r_serv_metadata = config_loader.get_redis_conn("ARDB_Metadata")
+config_loader = None

 def exist_item(item_id):
     if os.path.isfile(os.path.join(PASTES_FOLDER, item_id)):
@@ -17,20 +17,22 @@ Conditions to fulfill to be able to use this class correctly:
 """

 import os
+import re
+import sys
 import magic
 import gzip
 import redis
 import operator
 import string
-import re
 import json
-import configparser
 from io import StringIO
-import sys
 sys.path.append(os.path.join(os.environ['AIL_BIN'], 'packages/'))
 from Date import Date
 from Hash import Hash

+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
+
 from langid.langid import LanguageIdentifier, model

 from nltk.tokenize import RegexpTokenizer
@@ -58,31 +60,12 @@ class Paste(object):

     def __init__(self, p_path):

-        configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-        if not os.path.exists(configfile):
-            raise Exception('Unable to find the configuration file. \
-            Did you set environment variables? \
-            Or activate the virtualenv.')
-
-        cfg = configparser.ConfigParser()
-        cfg.read(configfile)
-        self.cache = redis.StrictRedis(
-            host=cfg.get("Redis_Queues", "host"),
-            port=cfg.getint("Redis_Queues", "port"),
-            db=cfg.getint("Redis_Queues", "db"),
-            decode_responses=True)
-        self.store = redis.StrictRedis(
-            host=cfg.get("Redis_Data_Merging", "host"),
-            port=cfg.getint("Redis_Data_Merging", "port"),
-            db=cfg.getint("Redis_Data_Merging", "db"),
-            decode_responses=True)
-        self.store_metadata = redis.StrictRedis(
-            host=cfg.get("ARDB_Metadata", "host"),
-            port=cfg.getint("ARDB_Metadata", "port"),
-            db=cfg.getint("ARDB_Metadata", "db"),
-            decode_responses=True)
-
-        self.PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "pastes"))
+        config_loader = ConfigLoader.ConfigLoader()
+        self.cache = config_loader.get_redis_conn("Redis_Queues")
+        self.store = config_loader.get_redis_conn("Redis_Data_Merging")
+        self.store_metadata = config_loader.get_redis_conn("ARDB_Metadata")
+
+        self.PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes"))

         if self.PASTES_FOLDER not in p_path:
             self.p_rel_path = p_path
             self.p_path = os.path.join(self.PASTES_FOLDER, p_path)
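The `PASTES_FOLDER` handling kept at the end of the hunk above accepts both relative and absolute item paths. A self-contained sketch of that normalization; the function name is illustrative, and the absolute-path branch (not shown in the hunk) is an assumption about how the relative path would be recovered:

```python
import os

def resolve_item_path(pastes_folder: str, p_path: str):
    """Mirror of the path handling in Paste.__init__: if the given path
    does not already contain the pastes folder, treat it as relative and
    join it onto the folder. The else branch is assumed for illustration."""
    if pastes_folder not in p_path:
        p_rel_path = p_path
        p_abs_path = os.path.join(pastes_folder, p_path)
    else:
        # Assumed inverse: strip the folder prefix to recover the relative id.
        p_abs_path = p_path
        p_rel_path = p_path.replace(pastes_folder + '/', '', 1)
    return p_rel_path, p_abs_path

print(resolve_item_path('/home/ail/PASTES', 'archive/2019/01/01/item.gz'))
```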
@@ -14,15 +14,20 @@ from collections import defaultdict
 from nltk.tokenize import RegexpTokenizer
 from textblob import TextBlob

-sys.path.append(os.path.join(os.environ['AIL_FLASK'], 'modules'))
-import Flask_config
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader

 from flask import escape

 import Date
 import Item

-r_serv_term = Flask_config.r_serv_term
-email_regex = Flask_config.email_regex
+config_loader = ConfigLoader.ConfigLoader()
+r_serv_term = config_loader.get_redis_conn("ARDB_Tracker")
+config_loader = None
+
+email_regex = r'[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,6}'
+email_regex = re.compile(email_regex)

 special_characters = set('[<>~!?@#$%^&*|()_-+={}":;,.\'\n\r\t]/\\')
 special_characters.add('\\s')
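The hunk above moves the email pattern out of `Flask_config` and compiles it locally. The same pattern can be tried standalone:

```python
import re

# The pattern Term.py now compiles itself instead of importing it
# from Flask_config.
email_regex = re.compile(r'[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,6}')

print(bool(email_regex.search("contact: admin@example.com")))  # True
print(bool(email_regex.search("no address here")))             # False
```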
@@ -2,9 +2,12 @@
 # -*-coding:UTF-8 -*

 import os
+import sys
 import redis
 import bcrypt
-import configparser
+
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader

 from flask_login import UserMixin

@@ -12,20 +15,10 @@ class User(UserMixin):

     def __init__(self, id):

-        configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-        if not os.path.exists(configfile):
-            raise Exception('Unable to find the configuration file. \
-            Did you set environment variables? \
-            Or activate the virtualenv.')
-
-        cfg = configparser.ConfigParser()
-        cfg.read(configfile)
-
-        self.r_serv_db = redis.StrictRedis(
-            host=cfg.get("ARDB_DB", "host"),
-            port=cfg.getint("ARDB_DB", "port"),
-            db=cfg.getint("ARDB_DB", "db"),
-            decode_responses=True)
+        config_loader = ConfigLoader.ConfigLoader()
+        self.r_serv_db = config_loader.get_redis_conn("ARDB_DB")
+        config_loader = None

         if self.r_serv_db.hexists('user:all', id):
             self.id = id
@@ -1,14 +1,20 @@
 #!/usr/bin/python3
 
-import re
 import os
-import configparser
+import re
+import sys
 import dns.resolver
 
 from pubsublogger import publisher
 
 from datetime import timedelta
 
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
+
+config_loader = ConfigLoader.ConfigLoader()
+dns_server = config_loader.get_config_str("Web", "dns")
+config_loader = None
+
 def is_luhn_valid(card_number):
     """Apply the Luhn algorithm to validate credit card.
 
@@ -103,14 +109,6 @@ def checking_MX_record(r_serv, adress_set, addr_dns):
 
 
 def checking_A_record(r_serv, domains_set):
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-    dns_server = cfg.get("Web", "dns")
 
     score = 0
     num = len(domains_set)
 
@@ -1,7 +1,6 @@
 #!/usr/bin/env python3
 # -*-coding:UTF-8 -*
 
-import configparser
 import os
 import sys
 import gzip
@@ -17,6 +16,9 @@ import sflock
 from Helper import Process
 from pubsublogger import publisher
 
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
+
 def create_paste(uuid, paste_content, ltags, ltagsgalaxies, name):
 
     now = datetime.datetime.now()
@@ -154,44 +156,13 @@ if __name__ == "__main__":
     publisher.port = 6380
     publisher.channel = "Script"
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-
-    r_serv_db = redis.StrictRedis(
-        host=cfg.get("ARDB_DB", "host"),
-        port=cfg.getint("ARDB_DB", "port"),
-        db=cfg.getint("ARDB_DB", "db"),
-        decode_responses=True)
-
-    r_serv_log_submit = redis.StrictRedis(
-        host=cfg.get("Redis_Log_submit", "host"),
-        port=cfg.getint("Redis_Log_submit", "port"),
-        db=cfg.getint("Redis_Log_submit", "db"),
-        decode_responses=True)
-
-    r_serv_tags = redis.StrictRedis(
-        host=cfg.get("ARDB_Tags", "host"),
-        port=cfg.getint("ARDB_Tags", "port"),
-        db=cfg.getint("ARDB_Tags", "db"),
-        decode_responses=True)
-
-    r_serv_metadata = redis.StrictRedis(
-        host=cfg.get("ARDB_Metadata", "host"),
-        port=cfg.getint("ARDB_Metadata", "port"),
-        db=cfg.getint("ARDB_Metadata", "db"),
-        decode_responses=True)
-
-    serv_statistics = redis.StrictRedis(
-        host=cfg.get('ARDB_Statistics', 'host'),
-        port=cfg.getint('ARDB_Statistics', 'port'),
-        db=cfg.getint('ARDB_Statistics', 'db'),
-        decode_responses=True)
+    config_loader = ConfigLoader.ConfigLoader()
+
+    r_serv_db = config_loader.get_redis_conn("ARDB_DB")
+    r_serv_log_submit = config_loader.get_redis_conn("Redis_Log_submit")
+    r_serv_tags = config_loader.get_redis_conn("ARDB_Tags")
+    r_serv_metadata = config_loader.get_redis_conn("ARDB_Metadata")
+    serv_statistics = config_loader.get_redis_conn("ARDB_Statistics")
 
     expire_time = 120
     MAX_FILE_SIZE = 1000000000
@@ -200,7 +171,9 @@ if __name__ == "__main__":
     config_section = 'submit_paste'
    p = Process(config_section)
 
-    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "pastes")) + '/'
+    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes")) + '/'
+
+    config_loader = None
 
     while True:
 
@@ -5,29 +5,21 @@ import os
 import sys
 import json
 import redis
-import configparser
 from TorSplashCrawler import TorSplashCrawler
 
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
+
 if __name__ == '__main__':
 
     if len(sys.argv) != 2:
         print('usage:', 'tor_crawler.py', 'uuid')
         exit(1)
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-
-    redis_cache = redis.StrictRedis(
-        host=cfg.get("Redis_Cache", "host"),
-        port=cfg.getint("Redis_Cache", "port"),
-        db=cfg.getint("Redis_Cache", "db"),
-        decode_responses=True)
+    config_loader = ConfigLoader.ConfigLoader()
+    redis_cache = config_loader.get_redis_conn("Redis_Cache")
+    config_loader = None
 
     # get crawler config key
     uuid = sys.argv[1]
 
@@ -13,23 +13,16 @@ import os
 import sys
 import redis
 import subprocess
-import configparser
+
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
 
 if __name__ == "__main__":
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-
-    r_serv = redis.StrictRedis(
-        host=cfg.get("ARDB_DB", "host"),
-        port=cfg.getint("ARDB_DB", "port"),
-        db=cfg.getint("ARDB_DB", "db"),
-        decode_responses=True)
+    config_loader = ConfigLoader.ConfigLoader()
+    r_serv = config_loader.get_redis_conn("ARDB_DB")
+    config_loader = None
 
     if r_serv.scard('ail:update_v1.5') != 5:
         r_serv.delete('ail:update_error')
 
@@ -81,8 +81,8 @@ pushd ardb/
 make
 popd
 
-if [ ! -f bin/packages/config.cfg ]; then
-    cp bin/packages/config.cfg.sample bin/packages/config.cfg
+if [ ! -f configs/core.cfg ]; then
+    cp configs/core.cfg.sample configs/core.cfg
 fi
 
 if [ -z "$VIRTUAL_ENV" ]; then
 
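The sample-copy logic in the hunk above only covers fresh installs; an existing checkout still has its config at the old path. A hedged sketch of a one-time migration (the old and new paths come from this commit; the function name and the script itself are illustrative, not part of the repository):

```shell
# One-time migration sketch for an existing checkout, run from the AIL
# repository root. Moves the old per-checkout config to its new location.
migrate_config() {
    if [ -f bin/packages/config.cfg ] && [ ! -f configs/core.cfg ]; then
        mkdir -p configs
        mv bin/packages/config.cfg configs/core.cfg
    fi
}
```

The guard keeps it idempotent: rerunning it after the move, or on a fresh install that never had the old file, is a no-op.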
@@ -9,6 +9,9 @@ import argparse
 import datetime
 import configparser
 
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
+
 if __name__ == '__main__':
     parser = argparse.ArgumentParser(description='AIL default update')
     parser.add_argument('-t', help='version tag' , type=str, dest='tag', required=True)
@@ -23,19 +26,9 @@ if __name__ == '__main__':
 
     start_deb = time.time()
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-
-    r_serv = redis.StrictRedis(
-        host=cfg.get("ARDB_DB", "host"),
-        port=cfg.getint("ARDB_DB", "port"),
-        db=cfg.getint("ARDB_DB", "db"),
-        decode_responses=True)
+    config_loader = ConfigLoader.ConfigLoader()
+    r_serv = config_loader.get_redis_conn("ARDB_DB")
+    config_loader = None
 
     #Set current ail version
     r_serv.set('ail:version', update_tag)
 
@@ -5,7 +5,9 @@ import os
 import sys
 import time
 import redis
-import configparser
+
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
 
 def update_tracked_terms(main_key, tracked_container_key):
     for tracked_item in r_serv_term.smembers(main_key):
@@ -50,45 +52,16 @@ if __name__ == '__main__':
 
     start_deb = time.time()
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
+    config_loader = ConfigLoader.ConfigLoader()
 
-    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "pastes")) + '/'
+    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes")) + '/'
 
-    r_serv = redis.StrictRedis(
-        host=cfg.get("ARDB_DB", "host"),
-        port=cfg.getint("ARDB_DB", "port"),
-        db=cfg.getint("ARDB_DB", "db"),
-        decode_responses=True)
-
-    r_serv_metadata = redis.StrictRedis(
-        host=cfg.get("ARDB_Metadata", "host"),
-        port=cfg.getint("ARDB_Metadata", "port"),
-        db=cfg.getint("ARDB_Metadata", "db"),
-        decode_responses=True)
-
-    r_serv_tag = redis.StrictRedis(
-        host=cfg.get("ARDB_Tags", "host"),
-        port=cfg.getint("ARDB_Tags", "port"),
-        db=cfg.getint("ARDB_Tags", "db"),
-        decode_responses=True)
-
-    r_serv_term = redis.StrictRedis(
-        host=cfg.get("ARDB_TermFreq", "host"),
-        port=cfg.getint("ARDB_TermFreq", "port"),
-        db=cfg.getint("ARDB_TermFreq", "db"),
-        decode_responses=True)
-
-    r_serv_onion = redis.StrictRedis(
-        host=cfg.get("ARDB_Onion", "host"),
-        port=cfg.getint("ARDB_Onion", "port"),
-        db=cfg.getint("ARDB_Onion", "db"),
-        decode_responses=True)
+    r_serv = config_loader.get_redis_conn("ARDB_DB")
+    r_serv_metadata = config_loader.get_redis_conn("ARDB_Metadata")
+    r_serv_tag = config_loader.get_redis_conn("ARDB_Tags")
+    r_serv_term = config_loader.get_redis_conn("ARDB_TermFreq")
+    r_serv_onion = config_loader.get_redis_conn("ARDB_Onion")
+    config_loader = None
 
     r_serv.set('ail:current_background_script', 'metadata')
 
@@ -6,7 +6,9 @@ import sys
 import time
 import redis
 import datetime
-import configparser
+
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
 
 def substract_date(date_from, date_to):
     date_from = datetime.date(int(date_from[0:4]), int(date_from[4:6]), int(date_from[6:8]))
@@ -39,39 +41,15 @@ if __name__ == '__main__':
 
     start_deb = time.time()
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
+    config_loader = ConfigLoader.ConfigLoader()
 
-    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "pastes")) + '/'
+    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes")) + '/'
 
-    r_serv = redis.StrictRedis(
-        host=cfg.get("ARDB_DB", "host"),
-        port=cfg.getint("ARDB_DB", "port"),
-        db=cfg.getint("ARDB_DB", "db"),
-        decode_responses=True)
-
-    r_serv_metadata = redis.StrictRedis(
-        host=cfg.get("ARDB_Metadata", "host"),
-        port=cfg.getint("ARDB_Metadata", "port"),
-        db=cfg.getint("ARDB_Metadata", "db"),
-        decode_responses=True)
-
-    r_serv_tag = redis.StrictRedis(
-        host=cfg.get("ARDB_Tags", "host"),
-        port=cfg.getint("ARDB_Tags", "port"),
-        db=cfg.getint("ARDB_Tags", "db"),
-        decode_responses=True)
-
-    r_serv_onion = redis.StrictRedis(
-        host=cfg.get("ARDB_Onion", "host"),
-        port=cfg.getint("ARDB_Onion", "port"),
-        db=cfg.getint("ARDB_Onion", "db"),
-        decode_responses=True)
+    r_serv = config_loader.get_redis_conn("ARDB_DB")
+    r_serv_metadata = config_loader.get_redis_conn("ARDB_Metadata")
+    r_serv_tag = config_loader.get_redis_conn("ARDB_Tags")
+    r_serv_onion = config_loader.get_redis_conn("ARDB_Onion")
+    config_loader = None
 
     r_serv.set('ail:current_background_script', 'onions')
     r_serv.set('ail:current_background_script_stat', 0)
 
@@ -6,10 +6,12 @@ import sys
 import time
 import redis
 import datetime
-import configparser
 
 from hashlib import sha256
 
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
+
 def rreplace(s, old, new, occurrence):
     li = s.rsplit(old, occurrence)
     return new.join(li)
@@ -28,41 +30,18 @@ if __name__ == '__main__':
 
     start_deb = time.time()
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-    SCREENSHOT_FOLDER = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "crawled_screenshot"))
-    NEW_SCREENSHOT_FOLDER = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "crawled_screenshot"), 'screenshot')
+    config_loader = ConfigLoader.ConfigLoader()
 
-    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "pastes")) + '/'
+    SCREENSHOT_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "crawled_screenshot"))
+    NEW_SCREENSHOT_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "crawled_screenshot"), 'screenshot')
 
-    r_serv = redis.StrictRedis(
-        host=cfg.get("ARDB_DB", "host"),
-        port=cfg.getint("ARDB_DB", "port"),
-        db=cfg.getint("ARDB_DB", "db"),
-        decode_responses=True)
+    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes")) + '/'
 
-    r_serv_metadata = redis.StrictRedis(
-        host=cfg.get("ARDB_Metadata", "host"),
-        port=cfg.getint("ARDB_Metadata", "port"),
-        db=cfg.getint("ARDB_Metadata", "db"),
-        decode_responses=True)
-
-    r_serv_tag = redis.StrictRedis(
-        host=cfg.get("ARDB_Tags", "host"),
-        port=cfg.getint("ARDB_Tags", "port"),
-        db=cfg.getint("ARDB_Tags", "db"),
-        decode_responses=True)
-
-    r_serv_onion = redis.StrictRedis(
-        host=cfg.get("ARDB_Onion", "host"),
-        port=cfg.getint("ARDB_Onion", "port"),
-        db=cfg.getint("ARDB_Onion", "db"),
-        decode_responses=True)
+    r_serv = config_loader.get_redis_conn("ARDB_DB")
+    r_serv_metadata = config_loader.get_redis_conn("ARDB_Metadata")
+    r_serv_tag = config_loader.get_redis_conn("ARDB_Tags")
+    r_serv_onion = config_loader.get_redis_conn("ARDB_Onion")
+    config_loader = None
 
     r_serv.set('ail:current_background_script', 'crawled_screenshot')
     r_serv.set('ail:current_background_script_stat', 0)
 
@@ -5,58 +5,36 @@ import os
 import sys
 import time
 import redis
-import configparser
+
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
 
 if __name__ == '__main__':
 
     start_deb = time.time()
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
+    config_loader = ConfigLoader.ConfigLoader()
 
-    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "pastes")) + '/'
+    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes")) + '/'
 
-    r_serv = redis.StrictRedis(
-        host=cfg.get("ARDB_DB", "host"),
-        port=cfg.getint("ARDB_DB", "port"),
-        db=cfg.getint("ARDB_DB", "db"),
-        decode_responses=True)
-
-    r_serv_metadata = redis.StrictRedis(
-        host=cfg.get("ARDB_Metadata", "host"),
-        port=cfg.getint("ARDB_Metadata", "port"),
-        db=cfg.getint("ARDB_Metadata", "db"),
-        decode_responses=True)
-
-    r_serv_tag = redis.StrictRedis(
-        host=cfg.get("ARDB_Tags", "host"),
-        port=cfg.getint("ARDB_Tags", "port"),
-        db=cfg.getint("ARDB_Tags", "db"),
-        decode_responses=True)
-
-    r_serv_onion = redis.StrictRedis(
-        host=cfg.get("ARDB_Onion", "host"),
-        port=cfg.getint("ARDB_Onion", "port"),
-        db=cfg.getint("ARDB_Onion", "db"),
-        decode_responses=True)
+    r_serv = config_loader.get_redis_conn("ARDB_DB")
+    r_serv_metadata = config_loader.get_redis_conn("ARDB_Metadata")
+    r_serv_tag = config_loader.get_redis_conn("ARDB_Tags")
+    r_serv_onion = config_loader.get_redis_conn("ARDB_Onion")
 
     r_important_paste_2018 = redis.StrictRedis(
-        host=cfg.get("ARDB_Metadata", "host"),
-        port=cfg.getint("ARDB_Metadata", "port"),
+        host=config_loader.get_config_str("ARDB_Metadata", "host"),
+        port=config_loader.get_config_int("ARDB_Metadata", "port"),
         db=2018,
         decode_responses=True)
 
     r_important_paste_2019 = redis.StrictRedis(
-        host=cfg.get("ARDB_Metadata", "host"),
-        port=cfg.getint("ARDB_Metadata", "port"),
-        db=2018,
+        host=config_loader.get_config_str("ARDB_Metadata", "host"),
+        port=config_loader.get_config_int("ARDB_Metadata", "port"),
+        db=2019,
         decode_responses=True)
 
+    config_loader = None
+
     r_serv.set('ail:current_background_script', 'tags')
     r_serv.set('ail:current_background_script_stat', 0)
 
@@ -5,7 +5,9 @@ import os
 import sys
 import time
 import redis
-import configparser
+
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
 
 def tags_key_fusion(old_item_path_key, new_item_path_key):
     print('fusion:')
@@ -19,33 +21,14 @@ if __name__ == '__main__':
 
     start_deb = time.time()
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
+    config_loader = ConfigLoader.ConfigLoader()
 
-    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "pastes")) + '/'
+    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes")) + '/'
 
-    r_serv = redis.StrictRedis(
-        host=cfg.get("ARDB_DB", "host"),
-        port=cfg.getint("ARDB_DB", "port"),
-        db=cfg.getint("ARDB_DB", "db"),
-        decode_responses=True)
-
-    r_serv_metadata = redis.StrictRedis(
-        host=cfg.get("ARDB_Metadata", "host"),
-        port=cfg.getint("ARDB_Metadata", "port"),
-        db=cfg.getint("ARDB_Metadata", "db"),
-        decode_responses=True)
-
-    r_serv_tag = redis.StrictRedis(
-        host=cfg.get("ARDB_Tags", "host"),
-        port=cfg.getint("ARDB_Tags", "port"),
-        db=cfg.getint("ARDB_Tags", "db"),
-        decode_responses=True)
+    r_serv = config_loader.get_redis_conn("ARDB_DB")
+    r_serv_metadata = config_loader.get_redis_conn("ARDB_Metadata")
+    r_serv_tag = config_loader.get_redis_conn("ARDB_Tags")
+    config_loader = None
 
     if r_serv.sismember('ail:update_v1.5', 'tags'):
 
@@ -6,33 +6,21 @@ import sys
 import time
 import redis
 import datetime
-import configparser
+
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
 
 if __name__ == '__main__':
 
     start_deb = time.time()
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
+    config_loader = ConfigLoader.ConfigLoader()
 
-    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], cfg.get("Directories", "pastes")) + '/'
+    PASTES_FOLDER = os.path.join(os.environ['AIL_HOME'], config_loader.get_config_str("Directories", "pastes")) + '/'
 
-    r_serv = redis.StrictRedis(
-        host=cfg.get("ARDB_DB", "host"),
-        port=cfg.getint("ARDB_DB", "port"),
-        db=cfg.getint("ARDB_DB", "db"),
-        decode_responses=True)
-
-    r_serv_onion = redis.StrictRedis(
-        host=cfg.get("ARDB_Onion", "host"),
-        port=cfg.getint("ARDB_Onion", "port"),
-        db=cfg.getint("ARDB_Onion", "db"),
-        decode_responses=True)
+    r_serv = config_loader.get_redis_conn("ARDB_DB")
+    r_serv_onion = config_loader.get_redis_conn("ARDB_Onion")
+    config_loader = None
 
     print()
     print('Updating ARDB_Onion ...')
 
@@ -6,25 +6,18 @@ import sys
 import time
 import redis
 import datetime
-import configparser
+
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
 
 if __name__ == '__main__':
 
     start_deb = time.time()
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-
-    r_serv = redis.StrictRedis(
-        host=cfg.get("ARDB_DB", "host"),
-        port=cfg.getint("ARDB_DB", "port"),
-        db=cfg.getint("ARDB_DB", "db"),
-        decode_responses=True)
+    config_loader = ConfigLoader.ConfigLoader()
+    r_serv = config_loader.get_redis_conn("ARDB_DB")
+    config_loader = None
 
     #Set current ail version
     r_serv.set('ail:version', 'v1.7')
 
@@ -6,25 +6,18 @@ import sys
 import time
 import redis
 import datetime
-import configparser
+
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
 
 if __name__ == '__main__':
 
     start_deb = time.time()
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-
-    r_serv = redis.StrictRedis(
-        host=cfg.get("ARDB_DB", "host"),
-        port=cfg.getint("ARDB_DB", "port"),
-        db=cfg.getint("ARDB_DB", "db"),
-        decode_responses=True)
+    config_loader = ConfigLoader.ConfigLoader()
+    r_serv = config_loader.get_redis_conn("ARDB_DB")
+    config_loader = None
 
     #Set current ail version
     r_serv.set('ail:version', 'v2.0')
 
@ -7,12 +7,14 @@ import sys
|
||||||
import time
|
import time
|
||||||
import redis
|
import redis
|
||||||
import datetime
|
import datetime
|
||||||
import configparser
|
|
||||||
|
|
||||||
sys.path.append(os.path.join(os.environ['AIL_BIN'], 'packages'))
|
sys.path.append(os.path.join(os.environ['AIL_BIN'], 'packages'))
|
||||||
import Item
|
import Item
|
||||||
import Term
|
import Term
|
||||||
|
|
||||||
|
sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
|
||||||
|
import ConfigLoader
|
||||||
|
|
||||||
|
|
||||||
def rreplace(s, old, new, occurrence):
|
def rreplace(s, old, new, occurrence):
|
||||||
li = s.rsplit(old, occurrence)
|
li = s.rsplit(old, occurrence)
|
||||||
@@ -23,25 +25,11 @@ if __name__ == '__main__':
 
     start_deb = time.time()
 
-    configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg.sample')
-    if not os.path.exists(configfile):
-        raise Exception('Unable to find the configuration file. \
-                        Did you set environment variables? \
-                        Or activate the virtualenv.')
-    cfg = configparser.ConfigParser()
-    cfg.read(configfile)
-
-    r_serv_term_stats = redis.StrictRedis(
-        host=cfg.get("ARDB_Trending", "host"),
-        port=cfg.getint("ARDB_Trending", "port"),
-        db=cfg.getint("ARDB_Trending", "db"),
-        decode_responses=True)
-
-    r_serv_termfreq = redis.StrictRedis(
-        host=cfg.get("ARDB_TermFreq", "host"),
-        port=cfg.getint("ARDB_TermFreq", "port"),
-        db=cfg.getint("ARDB_TermFreq", "db"),
-        decode_responses=True)
+    config_loader = ConfigLoader.ConfigLoader()
+
+    r_serv_term_stats = config_loader.get_redis_conn("ARDB_Trending")
+    r_serv_termfreq = config_loader.get_redis_conn("ARDB_TermFreq")
+    config_loader = None
 
     r_serv_term_stats.flushdb()
@@ -11,7 +11,6 @@ import redis
 import random
 import logging
 import logging.handlers
-import configparser
 
 from flask import Flask, render_template, jsonify, request, Request, Response, session, redirect, url_for
 from flask_login import LoginManager, current_user, login_user, logout_user, login_required
@@ -4,28 +4,18 @@
 import os
 import sys
 import redis
-import configparser
+
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
 
 sys.path.append(os.path.join(os.environ['AIL_FLASK'], 'modules'))
 
 from Role_Manager import create_user_db, edit_user_db, get_default_admin_token, gen_password
 
-configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-if not os.path.exists(configfile):
-    raise Exception('Unable to find the configuration file. \
-                    Did you set environment variables? \
-                    Or activate the virtualenv.')
-cfg = configparser.ConfigParser()
-cfg.read(configfile)
-
-r_serv = redis.StrictRedis(
-    host=cfg.get("ARDB_DB", "host"),
-    port=cfg.getint("ARDB_DB", "port"),
-    db=cfg.getint("ARDB_DB", "db"),
-    decode_responses=True)
+config_loader = ConfigLoader.ConfigLoader()
+
+r_serv = config_loader.get_redis_conn("ARDB_DB")
+config_loader = None
 
 
 if __name__ == "__main__":
@@ -3,9 +3,12 @@
 
 import os
 import re
+import sys
 import redis
 import bcrypt
-import configparser
+
+sys.path.append(os.path.join(os.environ['AIL_BIN'], 'lib/'))
+import ConfigLoader
 
 from functools import wraps
 from flask_login import LoginManager, current_user, login_user, logout_user, login_required
@@ -16,20 +19,10 @@ login_manager = LoginManager()
 login_manager.login_view = 'role'
 
 # CONFIG #
-configfile = os.path.join(os.environ['AIL_BIN'], 'packages/config.cfg')
-if not os.path.exists(configfile):
-    raise Exception('Unable to find the configuration file. \
-                    Did you set environment variables? \
-                    Or activate the virtualenv.')
-
-cfg = configparser.ConfigParser()
-cfg.read(configfile)
-
-r_serv_db = redis.StrictRedis(
-    host=cfg.get("ARDB_DB", "host"),
-    port=cfg.getint("ARDB_DB", "port"),
-    db=cfg.getint("ARDB_DB", "db"),
-    decode_responses=True)
+config_loader = ConfigLoader.ConfigLoader()
+r_serv_db = config_loader.get_redis_conn("ARDB_DB")
+config_loader = None
 
 default_passwd_file = os.path.join(os.environ['AIL_HOME'], 'DEFAULT_PASSWORD')
@@ -386,15 +386,19 @@ img.onload = pixelate;
 img.addEventListener("error", img_error);
 var draw_img = false;
 
-{%if dict_domain['crawler_history']['random_item']['screenshot']%}
-  var screenshot = "{{dict_domain['crawler_history']['random_item']['screenshot']}}";
-  var selected_icon = $("#"+screenshot.replace(/\//g, ""));
-  selected_icon.addClass("icon_selected");
-  selected_icon.removeClass("icon_img");
+{%if "crawler_history" in dict_domain%}
+  {%if dict_domain['crawler_history']['random_item']['screenshot']%}
+    var screenshot = "{{dict_domain['crawler_history']['random_item']['screenshot']}}";
+    var selected_icon = $("#"+screenshot.replace(/\//g, ""));
+    selected_icon.addClass("icon_selected");
+    selected_icon.removeClass("icon_img");
 
 $("#screenshot_link").attr("href", "screenshot_href + {{dict_domain['crawler_history']['random_item']['id']}}");
 $("#screenshot_link").text("{{dict_domain['crawler_history']['random_item']['link']}}");
+  {%else%}
+    var screenshot = "";
+  {%endif%}
 {%else%}
   var screenshot = "";
 {%endif%}
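For readers unfamiliar with the new helper, here is a minimal, dependency-free sketch of the pattern this commit introduces: one loader object parses `configs/core.cfg` once, and each caller asks it for a connection by section name instead of repeating `host`/`port`/`db` lookups. The names `ConfigLoader` and `get_redis_conn` come from the diff; the internals below are illustrative assumptions only, and the real loader in `bin/lib/ConfigLoader.py` returns an actual `redis.StrictRedis` connection rather than a dict of parameters.

```python
import configparser

# Inline stand-in for configs/core.cfg (section name taken from the diff).
SAMPLE_CFG = """
[ARDB_DB]
host = localhost
port = 6382
db = 0
"""

class ConfigLoader:
    """Hypothetical sketch: parse the config once, hand out connections on demand."""

    def __init__(self, cfg_text=SAMPLE_CFG):
        self.cfg = configparser.ConfigParser()
        self.cfg.read_string(cfg_text)

    def get_redis_conn(self, section):
        # The real implementation would build redis.StrictRedis(...) here;
        # returning the resolved parameters keeps this sketch dependency-free.
        return {
            "host": self.cfg.get(section, "host"),
            "port": self.cfg.getint(section, "port"),
            "db": self.cfg.getint(section, "db"),
            "decode_responses": True,
        }

# Usage mirrors the pattern repeated across the hunks above:
config_loader = ConfigLoader()
r_serv = config_loader.get_redis_conn("ARDB_DB")
config_loader = None  # the commit drops the loader reference once connections exist
print(r_serv["host"], r_serv["port"])
```

The payoff of the refactor is that the `raise Exception('Unable to find the configuration file. ...')` boilerplate and the four-argument `redis.StrictRedis(...)` call, previously copied into every script, now live in a single module.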