commit 5148691174
parent 29058d9529
Good Evening 2018-08-06 23:22:28 +03:00

25 files changed, 464 insertions(+), 365 deletions(-)

.pylintrc (new file)

@@ -0,0 +1,6 @@
+[MASTER]
+extension-pkg-whitelist=GeoIP
+[FORMAT]
+indent-string=' '
+indent-after-paren=2

README.md

@@ -2,130 +2,143 @@
 alpha beta whatever
 partly works
+## roadmap
+refactor README and lib/plugin/plugins/*
+cleanup *
+probably Listener should be part of Core in order to supervise everything
+tasks don't store results currently. need to implement some validation of performed tasks (at least in Executor thread before spreading new tasks)
+http://python-rq.org/docs/results/
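The validation of performed tasks mentioned in the roadmap could look roughly like this: before the Executor spreads new tasks, items whose steps did not record a boolean result get rejected. This is a hypothetical sketch (`validate_items` does not exist in the codebase); only the `item['steps']` layout is taken from the project.

```python
# Hypothetical helper: split items into valid/invalid before re-queueing.
# Every executed step is expected to have stored True/False, not None.
def validate_items(items):
    valid, invalid = [], []
    for item in items:
        steps = item.get('steps', {})
        if steps and all(isinstance(v, bool) for v in steps.values()):
            valid.append(item)
        else:
            invalid.append(item)
    return valid, invalid

ok, bad = validate_items([
    {'steps': {'ftp_scan': True}},   # completed step -> valid
    {'steps': {'ftp_scan': None}},   # never finished -> invalid
])
```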
 ## configuration
 `data/config.yaml`
 ```
-# lists top-level services (like listeners, executors, data-manager) and pipelines enabled
+---
+dsl_version: 1
 core:
-  services: # should point to correct service
-    - data_manager
+  services:
+    - random_ip
     - rq_executor
+    - tg_feed
   pipelines:
     - ftp
+    - gopher
-# describes top-level services and their configurations
 services:
-  data_manager:
-    # REQUIRED package name, just like path to file with dots
-    package: lib.data.Manager
-    # REQUIRED class inherited from Service (lib.Service)
-    service: DataManager
-    # REQUIRED used to select one of storages
-    data:
-      id: pool
-    # there can be more service-specific configuration fields
-    # for now they can be found in code :)
-    sources:
-      - random_ip
-    feeds:
-      - test_telegram
-  rq_executor:
-    package: lib.exeq.Executor
-    service: RQExecutor
-    data:
-      id: pool
-    redis:
-      host: "127.0.0.1"
-# describes datasources for data_manager
-sources:
   random_ip:
     package: lib.plugin.base.lib.IP
     service: RandomIP
-    data:
-      id: random_ip
+    storage: ip_source
+  rq_executor:
+    package: lib.exec.Executor
+    service: RQExecutor
+    storage: pool
+    redis:
+      host: "127.0.0.1"
-# describes datafeeds for data_manager
-feeds:
-  test_telegram: # doesn't work yet, eh
+  tg_feed:
     package: lib.plugin.base.lib.Telegram
     service: TelegramFeed
-    data:
-      id: pool
-    token:
+    storage: pool
+    token: "mocken"
     chats:
-      - id: good_evening
-        pipelines: [ftp, gopher]
-        filter:
-          clause: any-of
-          equal:
-            - ftp_list_files_status: success
-            - gopher_collect_status: success
+      - id: aiWeipeighah7vufoHa0ieToipooYe
+        if:
+          steps.ftp_apply_tpl: true
+          data.filter: false
+      - id: ohl7AeGah5uo8cho4nae9Eemaeyae3
+        if:
+          steps.gopher_apply_tpl: true
+          data.filter: false
-# describes various storages, e.g. data pool for pipelines or queues for datasources
 storage:
   pool:
-    # REQUIRED
     package: lib.plugin.base.lib.Mongo
     service: MongoStorage
-    size: 40960
-    # service-specific
+    size: 0
     db: "medved"
     coll: 'pool'
-  random_ip:
+  ip_source:
     package: lib.plugin.base.lib.Mongo
     service: MongoStorage
-    size: 500
+    size: 800
     db: "medved"
-    coll: 'randomipsource'
+    coll: 'ip_source'
-# describes available pipelines
 pipelines:
   ftp:
-    # list of steps with dependencies
+    source: ip_source
     steps:
-      # will pass 10 items to lib.plugin.iscan.tasks.common.scan
-      - name: scan
-        package: lib.plugin.iscan.tasks.common
-        service: scan
-        multiple: 10 # default: False
-        requires: []
-      # will pass 1 item marked with ftp_scan to lib.plugin.iscan.tasks.ftp.connect
-      - name: connect
-        package: lib.plugin.iscan.tasks.ftp
-        service: connect
-        requires:
-          - ftp_scan
-      - name: list_files
-        package: lib.plugin.iscan.tasks.ftp
-        service: list_files
-        requires:
-          - ftp_connect
+      - task: ftp_scan
+        priority: low
+        parallel: 100
+      - task: ftp_connect
+        priority: normal
+        if:
+          steps.ftp_scan: true
+      - task: ftp_list_files
+        priority: high
+        if:
+          steps.ftp_connect: true
+      - task: ftp_apply_tpl
+        priority: high
+        if:
+          steps.ftp_list_files: true
+  gopher:
+    source: ip_source
+    steps:
+      - task: gopher_scan
+        priority: normal
+        parallel: 100
+      - task: gopher_find
+        priority: high
+        if:
+          steps.gopher_scan: true
+      - task: gopher_apply_tpl
+        priority: high
+        if:
+          steps.gopher_find: true
+  http:
+    source: ip_source
+    steps:
+      - task: http_scan
+        priority: low
+        parallel: 25
-# various configurations for tasks
 tasks:
+  gopher_scan:
+    package: lib.plugin.iscan.tasks.common
+    service: MasScanTask
+    ports:
+      - 70
+  gopher_find:
+    package: lib.plugin.iscan.tasks.gopher
+    service: GopherFindTask
+  gopher_apply_tpl:
+    package: lib.plugin.base.tasks.text
+    service: Jinja2TemplateTask
+    path: lib/plugin/iscan/templates/gopher.tpl
   ftp_scan:
+    package: lib.plugin.iscan.tasks.common
+    service: MasScanTask
     ports:
       - 21
   ftp_connect:
+    package: lib.plugin.iscan.tasks.ftp
+    service: FTPConnectTask
     logins: data/ftp/logins.txt
     passwords: data/ftp/passwords.txt
     bruteforce: true
     timeout: 15
   ftp_list_files:
+    package: lib.plugin.iscan.tasks.ftp
+    service: FTPListFilesTask
+    filter: true
+  ftp_apply_tpl:
+    package: lib.plugin.base.tasks.text
+    service: Jinja2TemplateTask
+    path: lib/plugin/iscan/templates/ftp.tpl
 logging:
-  Storage: INFO
+  Storage: DEBUG
+  Loader: DEBUG
 ```
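The `if:` clauses in the config above pair dotted paths (`steps.ftp_apply_tpl`, `data.filter`) with expected values, matched against each item dict. A minimal matcher could be sketched like this; `matches` is an illustration under that assumption, not the project's actual implementation.

```python
# Minimal sketch: evaluate dotted "if:" conditions against an item dict.
def matches(item, conditions):
    for dotted, expected in conditions.items():
        value = item
        for key in dotted.split('.'):
            # walk one level per dotted segment; missing keys become None
            value = value.get(key) if isinstance(value, dict) else None
        if value != expected:
            return False
    return True

item = {'steps': {'ftp_apply_tpl': True}, 'data': {'filter': False}}
matches(item, {'steps.ftp_apply_tpl': True, 'data.filter': False})  # -> True
```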
 probably it can be launched with docker, however I didn't test it yet
@@ -137,9 +150,10 @@ you'll need working redis and mongodb for default configuration
 ## top-level services
-### lib.data.Manager.DataManager
-Orchestrates datasources and datafeeds - starts and stops them, also checks pool size. If it is too low - takes data from DS.
-### lib.exeq.Executor.RQExecutor
+### sources ###
+### feeds ###
+### lib.exec.Executor.RQExecutor
 Should run pipelines described in configuration. Works via [RedisQueue](http://python-rq.org/), so needs some Redis up and running
 Basically takes data from pool and submits it to workers.
 RQ workers should be launched separately (`rqworker worker` from code root)
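Each pipeline step the executor submits resolves to a Task whose `_run` calls `_process` per item and records the boolean outcome under `item['steps'][task_id]`, as this commit's Task base class shows. A sketch of how a custom task would plug into that contract; the inlined `Task` here is a simplified stand-in for `lib.exec.Task`, and `PortOpenTask` is hypothetical.

```python
# Simplified stand-in for the Task base class from this commit:
# _run stores each item's _process result under item['steps'][task_id].
class Task:
    def __init__(self, id):
        self._id = id
    def _run(self, items):
        for item in items:
            item['steps'][self._id] = self._process(item)
        return items
    def _process(self, item):
        return True

class PortOpenTask(Task):
    """Hypothetical step: succeeds when data.ports is non-empty."""
    def _process(self, item):
        return bool(item['data'].get('ports'))

items = [{'data': {'ports': [21]}, 'steps': {}},
         {'data': {}, 'steps': {}}]
PortOpenTask('port_open')._run(items)
```

A downstream step's `if: steps.port_open: true` clause would then select only the first item.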

@@ -5,6 +5,7 @@ core:
   services:
     - random_ip
     - rq_executor
+    - GC
     - tg_feed
   pipelines:
     - ftp
@@ -16,21 +17,28 @@ services:
     service: RandomIP
     storage: ip_source
   rq_executor:
-    package: lib.exeq.Executor
+    package: lib.exec.Executor
     service: RQExecutor
     storage: pool
     redis:
       host: redis
+  GC:
+    package: lib.plugin.base.lib.GC
+    service: GarbageCollector
+    storage: pool
+    delay: 10
+    if:
+      steps.ftp_scan: false
+      steps.gopher_scan: false
   tg_feed:
     package: lib.plugin.base.lib.Telegram
     service: TelegramFeed
     storage: pool
-    token: "3"
+    token: "358947254:"
     chats:
       - id: aiWeipeighah7vufoHa0ieToipooYe
         if:
           steps.ftp_apply_tpl: true
+          data.filter: false
       - id: ohl7AeGah5uo8cho4nae9Eemaeyae3
         if:
           steps.gopher_apply_tpl: true
@@ -65,13 +73,19 @@ pipelines:
         if:
           steps.ftp_scan: true
       - task: ftp_list_files
-        priority: high
+        priority: normal
         if:
           steps.ftp_connect: true
+      - task: ftp_filter_files
+        priority: normal
+        parallel: 100
+        if:
+          steps.ftp_list_files: true
       - task: ftp_apply_tpl
         priority: high
         if:
-          steps.ftp_list_files: true
+          steps.ftp_filter_files: true
+          data.filter: false
   gopher:
     source: ip_source
     steps:
@@ -124,11 +138,15 @@ tasks:
   http_scan:
     package: lib.plugin.iscan.tasks.common
     service: MasScanTask
-    ports:
+    ports: &http_ports
       - 80
       - 81
       - 8080
       - 8081
+  http_find:
+    package: lib.plugin.iscan.tasks.http
+    service: HTTPFindTask
   ftp_scan:
     package: lib.plugin.iscan.tasks.common
@@ -138,19 +156,21 @@ tasks:
   ftp_connect:
     package: lib.plugin.iscan.tasks.ftp
     service: FTPConnectTask
-    logins: data/ftp/logins.txt
+    usernames: data/ftp/usernames.txt
     passwords: data/ftp/passwords.txt
     bruteforce: true
     timeout: 15
   ftp_list_files:
     package: lib.plugin.iscan.tasks.ftp
     service: FTPListFilesTask
-    filter: true
+  ftp_filter_files:
+    package: lib.plugin.iscan.tasks.ftp
+    service: FTPListFilesTask
   ftp_apply_tpl:
     package: lib.plugin.base.tasks.text
     service: Jinja2TemplateTask
     path: lib/plugin/iscan/templates/ftp.tpl
 logging:
-  Storage: INFO
+  Storage: DEBUG
   Loader: INFO

@@ -1,3 +1,8 @@
+"""
+Provides Service class
+"""
 from time import sleep
 from threading import Thread
 from lib import Logger, Loader, Loadable
@@ -23,6 +28,7 @@ class Service(Loadable):
         pass
     def start(self):
+        """Executes pre_start, starts thread and executes post_start"""
         self._logger.debug('pre_start')
         self._pre_start()
@@ -38,6 +44,7 @@ class Service(Loadable):
         self._logger.info('start finished')
     def stop(self):
+        """Executes pre_stop, stops thread and executes post_stop"""
         self._logger.debug('pre_stop')
         self._pre_stop()

@@ -1,20 +1,8 @@
-from queue import LifoQueue
-from time import sleep
-import itertools
-from lib.net import Remote
 from lib import Service
 class Feed(Service):
     """Base class for datafeeds"""
     def __init__(self, thread, id, root):
         super().__init__(thread, id, root)
         self._logger.add_field('service', 'Feed')
         self._logger.add_field('vname', self.__class__.__name__)
-    def get(self, plugin, count=1, timeout=3):
-        items = self._data.get(count)
-        self._logger.debug("get %s OF %s", len(items), count)
-        return items

@@ -1,10 +1,12 @@
+import copy
 from lib import Service
 class Source(Service):
     """Base class for datasources"""
     def __init__(self, thread, id, root):
         super().__init__(thread, id, root)
-        self._logger.add_field('service', 'Feed')
+        self._logger.add_field('service', 'Source')
         self._logger.add_field('vname', self.__class__.__name__)
         self._item = {
@@ -13,8 +15,8 @@ class Source(Service):
             'data': {}
         }
-    def next(self, count=10, block=False):
-        if self._running or not self._data.count() == 0:
-            return self._data.get(count=count, block=block)
-        elif self._data.count() == 0:
-            raise Exception("Storage is empty, generator is stopped")
+    def _create(self):
+        return copy.deepcopy(self._item)
+    def _prepare(self, item):
+        pass

@@ -1,4 +1,4 @@
-from queue import LifoQueue, Empty, Full
+import inspect
 from lib import Loadable, Logger
@ -27,7 +27,9 @@ class Storage(Loadable):
         return items
     def get(self, count=1, block=True, filter=None):
+        """Returns items, removing them from storage"""
-        self._logger.debug("get %s, %s", count, block)
+        self._logger.debug("get|%s|%s|%s",
+                           count, block, inspect.stack()[1][0].f_locals["self"].__class__.__name__)
         items = []
         if count == 1:
             items.append(self._get(block, filter))
@@ -44,55 +46,37 @@
             self._put(i, block)
     def put(self, items, block=True):
+        """Puts provided items"""
+        self._logger.debug("put|%s|%s|%s",
+                           len(items), block, inspect.stack()[1][0].f_locals["self"].__class__.__name__)
         if items:
             items = [i for i in items if i is not None]
-            self._logger.debug("put %s, %s", len(items), block)
             if len(items) == 1:
                 self._put(items[0], block)
             elif len(items) > 1:
                 self._put_many(items, block)
-    def _find(self):
+    def _find(self, filter):
         pass
-    def find(self):
-        self._logger.debug("find")
-        return self._find()
+    def find(self, filter):
+        """Returns items without removing them from storage"""
+        return self._find(filter)
     def _update(self, items, update):
         pass
     def update(self, items, update=None):
+        """Updates provided items"""
+        self._logger.debug("update|%s|%s",
+                           len(items), inspect.stack()[1][0].f_locals["self"].__class__.__name__)
         if items:
             items = [i for i in items if i is not None]
-            self._logger.debug("update %s, %s", len(items), update)
             self._update(items, update)
     def _remove(self, items):
         pass
     def remove(self, items):
+        """Removes provided items"""
         self._remove(items)
-class LiFoStorage(Storage):
-    def __init__(self, id, root):
-        super().__init__(id, root)
-        self._data = LifoQueue()
-    def count(self):
-        return self._data.qsize()
-    def _get(self, block=False, filter=None):
-        try:
-            return self._data.get(block=block)
-        except Empty:
-            pass
-    def _put(self, item, block=True):
-        try:
-            self._data.put(item, block=block)
-        except Full:
-            pass

@@ -57,7 +57,7 @@ class RQExecutor(Executor):
                 for i in items:
                     i['steps'][step['task']] = None
                 self._data.update(items)
-                job = q.enqueue("lib.exeq.Task.run", step['task'], items)
+                job = q.enqueue("lib.exec.Task.run", step['task'], items)
                 self._logger.info("%s|%s|%s|%s", job.id, step.get('priority', 'normal'), step['task'], len(items))
                 jobs.append(job.id)
             except Exception as e:

@ -16,8 +16,13 @@ class Task(Loadable):
         return result
     def _run(self, items):
+        for item in items:
+            item['steps'][self._id] = self._process(item)
         return items
+    def _process(self, item):
+        return True
 def run(task_name, items):
     result = Loader.by_id('tasks', task_name).run(items)
     return result

lib/plugin/base/lib/GC.py (new file)

@@ -0,0 +1,25 @@
+"""
+Provides garbage collector
+"""
+from time import sleep
+from lib import Service
+class GarbageCollector(Service):
+    """Simple GarbageCollector, removes items by filter periodically"""
+    def __init__(self, id, root):
+        super().__init__(self.__run, id, root)
+        self._logger.add_field('service', 'GC')
+        self._logger.add_field('vname', self.__class__.__name__)
+    def __run(self):
+        while self._running:
+            filter = {key: value for key, value in self.lcnf.get("if", {}).items()}
+            if filter:
+                items = self._data.find(filter=filter)
+                self._logger.info("Removing %s items", items.count())
+                self._data.remove(items)
+            else:
+                self._logger.error("Filter is empty!")
+            sleep(self.lcnf.get('delay', 600))

@@ -1,36 +1,51 @@
-from lib.data import Source
-from lib import Loader
-import copy
 from time import sleep
-import os
-import netaddr
 import itertools
 import random
 import socket
 import struct
+import netaddr
+import GeoIP
+from lib.data import Source
 class IPSource(Source):
+    """Base source for IPs, appends data.ip and data.geo"""
     def __init__(self, thread, id, root):
         super().__init__(thread, id, root)
-        self._item.update ({
-            'source': self._id,
+        self._item.update({
             'data': {
-                'ip': None
+                'ip': None,
+                'geo': {
+                    'country': None,
+                    'city': None
+                }
             }
         })
+        self.geo_ip = GeoIP.open(self.lcnf.get("geoip_dat", "/usr/share/GeoIP/GeoIPCity.dat"),
+                                 GeoIP.GEOIP_INDEX_CACHE | GeoIP.GEOIP_CHECK_CACHE)
+    def _geoip(self, item):
+        geodata = self.geo_ip.record_by_name(item['data']['ip'])
+        if geodata:
+            if 'country_code3' in geodata and geodata['country_code3']:
+                item['data']['geo']['country'] = geodata['country_code3']
+            if 'city' in geodata and geodata['city']:
+                item['data']['geo']['city'] = geodata['city']
+    def _prepare(self, item):
+        self._geoip(item)
 class IPRange(IPSource):
+    """Provides IPs from ranges specified in file"""
     def __init__(self, id, root):
         super().__init__(self.__run, id, root)
         self._iprange = []
         self.load_ip_range()
     def load_ip_range(self):
+        """Loads IP ranges from specified path"""
         ip_range = []
         with open(self.lcnf.get('path'), "r") as text:
             for line in text:
@@ -41,18 +56,19 @@ class IPRange(IPSource):
             else:
                 ip_range.append(netaddr.IPNetwork(diap[0]))
         except Exception as e:
-            raise Exception("Error while adding range {}: {}".format(line, e))
+            raise Exception("Error while adding range %s: %s" % (line, e))
         self._iprange = ip_range
     def __run(self):
-        npos = 0
-        apos = 0
+        npos = 0  # network cursor
+        apos = 0  # address cursor
         while self._running:
             try:
-                for _ in itertools.repeat(None, self.lcnf.get('oneshot', 100)):
+                for _ in itertools.repeat(None, self.lcnf.get('oneshot', 200)):
+                    item = self._create()
                     if self.lcnf.get('ordered', True):
                         # put currently selected element
-                        self._data.put(str(self._iprange[npos][apos]))
+                        item['data']['ip'] = str(self._iprange[npos][apos])
                         # rotate next element through networks and addresses
                         if apos + 1 < self._iprange[npos].size:
                             apos += 1
@@ -66,13 +82,16 @@ class IPRange(IPSource):
                     else:
                         self.stop()
                 else:
-                    self._data.put(str(random.choice(random.choice(self._iprange))))
+                    item['data']['ip'] = str(random.choice(random.choice(self._iprange)))
+                    self._prepare(item)
+                    self._data.put(item)
                 sleep(self.lcnf.get('delay', 0.5))
-            except Exception as e:
-                self._logger.warn(e)
+            except Exception as err:
+                self._logger.warn(err)
 class RandomIP(IPSource):
+    """Generates completely pseudorandom IPs"""
     def __init__(self, id, root):
         super().__init__(self.__run, id, root)
@@ -81,11 +100,12 @@ class RandomIP(IPSource):
             try:
                 items = []
                 for _ in itertools.repeat(None, self.lcnf.get("oneshot", 200)):
-                    item = copy.deepcopy(self._item)
+                    item = self._create()
                     randomip = socket.inet_ntoa(struct.pack('>I', random.randint(1, 0xffffffff)))
                     item['data']['ip'] = str(randomip)
+                    self._prepare(item)
                     items.append(item)
                 self._data.put(items)
                 sleep(self.lcnf.get("delay", 0.2))
-            except Exception as e:
-                self._logger.warn(e)
+            except Exception as err:
+                self._logger.warn(err)

@@ -1,5 +1,6 @@
-from pymongo import MongoClient
 from time import sleep
+from pymongo import MongoClient
 from lib.data import Storage
 class MongoStorage(Storage):
@@ -27,11 +28,9 @@ class MongoStorage(Storage):
             sleep(1)
         return item
-    def _get_many(self, count, block, filter, update=None):
+    def _get_many(self, count, block, filter):
         if filter is None:
             filter = {}
-        self._logger.debug("%s, %s", filter, update)
         items = self._coll.find(filter=filter, limit=count)
         return items
@@ -57,7 +56,6 @@ class MongoStorage(Storage):
     def _update(self, items, update):
         if update:
             filter = {'_id': {'$in': [item['_id'] for item in items]}}
-            self._logger.debug("%s, %s", filter, update)
             self._coll.update_many(filter, update, upsert=True)
         else:
             for item in items:

@@ -1,8 +1,7 @@
-from lib.data import Feed, Filter
-import telebot
 from time import sleep
+import telebot
+from lib.data import Feed
 class TelegramFeed(Feed):
     """Send data to Telegram chat"""
@@ -28,5 +27,5 @@ class TelegramFeed(Feed):
                     self._logger.debug("@%s: %s", chat_id, i['data']['message'])
                     tbot.send_message("@" + chat_id, i['data']['message'])
                     sleep(delay)
-            except Exception as e:
-                self._logger.warn(e)
+            except Exception as err:
+                self._logger.warn(err)

@@ -1,4 +1,4 @@
-from lib.exeq import Task
+from lib.exec import Task
 from jinja2 import Environment, FileSystemLoader

@@ -3,22 +3,19 @@
 import subprocess
 import json
 from jsoncomment import JsonComment
-from lib import Logger
-import GeoIP
-from Config import cnf
-from lib.exeq import Task
+from lib.exec import Task
-class MasScan:
-    def __init__(self, bin_path='/usr/bin/masscan', opts="-sS -Pn -n --wait 0 --max-rate 5000"):
-        self.bin_path = bin_path
-        self.opts_list = opts.split(' ')
-    def scan(self, ip_list, port_list):
+class MasScanTask(Task):
+    """Provides data.ports for each of items scanned with masscan"""
+    def scan(self, ip_list, port_list, bin_path, opts="-sS -Pn -n --wait 0 --max-rate 5000"):
+        """Executes masscan on given IPs/ports"""
+        bin_path = bin_path
+        opts_list = opts.split(' ')
         port_list = ','.join([str(p) for p in port_list])
         ip_list = ','.join([str(ip) for ip in ip_list])
-        process_list = [self.bin_path]
-        process_list.extend(self.opts_list)
+        process_list = [bin_path]
+        process_list.extend(opts_list)
         process_list.extend(['-oJ', '-', '-p'])
         process_list.append(port_list)
         process_list.append(ip_list)
@@ -29,45 +26,27 @@ class MasScan:
         result = parser.loads(out)
         return result
-class MasScanTask(Task):
-    def __init__(self, id, root):
-        super().__init__(id, root)
     def _run(self, items):
-        result = []
-        gi = GeoIP.open(cnf.get("geoip_dat", "/usr/share/GeoIP/GeoIPCity.dat"), GeoIP.GEOIP_INDEX_CACHE | GeoIP.GEOIP_CHECK_CACHE)
         ip_list = [i['data']['ip'] for i in items]
         port_list = self.lcnf.get("ports")
-        self._logger.debug("Starting scan, ip_list=%s, port_list=%s", ip_list, port_list)
-        ms = MasScan(bin_path=self.lcnf.get('bin_path', "/usr/bin/masscan"))
-        hosts = ms.scan(ip_list=ip_list, port_list=port_list)
+        self._logger.debug("Starting scan, port_list=%s", port_list)
+        hosts = self.scan(ip_list=ip_list,
+                          port_list=port_list,
+                          bin_path=self.lcnf.get('bin_path', "/usr/bin/masscan"))
         self._logger.debug(hosts)
         hosts = {h['ip']: h for h in hosts}
         for item in items:
-            data = {}
-            result = False
             if hosts.get(item['data']['ip']):
-                data = {
-                    'ports': [p['port'] for p in hosts[item['data']['ip']]['ports']],
-                    'geo': {
-                        'country': None,
-                        'city': None
-                    }
-                }
-                result = True
-                geodata = gi.record_by_name(item['data']['ip'])
-                if geodata:
-                    if 'country_code3' in geodata and geodata['country_code3']:
-                        data['geo']['country'] = geodata['country_code3']
-                    if 'city' in geodata and geodata['city']:
-                        data['geo']['city'] = geodata['city']
-                self._logger.debug(data)
-                item['data'].update(data)
-            item['steps'][self._id] = result
-            if result:
-                self._logger.debug("Found %s with open %s", item['data']['ip'], item['data']['ports'])
+                ports = [p['port'] for p in hosts[item['data']['ip']]['ports']]
+                if 'ports' in item['data']:
+                    item['data']['ports'].extend(ports)
+                else:
+                    item['data']['ports'] = ports
+                item['steps'][self._id] = True
+                self._logger.debug("Found %s with open ports %s", item['data']['ip'], ports)
+            else:
+                item['steps'][self._id] = False
         return items

@@ -1,125 +1,91 @@
-# pylint: disable=E1101
+"""
+Basic tasks for FTP services
+"""
 import ftplib
-import netaddr
-from lib import Logger
-from Config import cnf
-from lib.exeq import Task
+from lib.exec import Task
-class FTPConnectTask(Task):
-    def __init__(self, id, root):
-        super().__init__(id, root)
+class FTPConnectTask(Task): # pylint: disable=too-few-public-methods
+    """Tries to connect FTP service with various credentials"""
     def _process(self, item):
-        data = {}
-        result = False
-        self.ftp = ftplib.FTP(host=item['data']['ip'], timeout=self.lcnf.get('timeout', 30))
+        ftp = ftplib.FTP(host=item['data']['ip'], timeout=self.lcnf.get('timeout', 30))
         try:
             self._logger.debug('Trying anonymous login')
-            self.ftp.login()
-        except ftplib.error_perm:
-            pass
+            ftp.login()
+        except ftplib.error_perm as err:
+            self._logger.debug('Failed (%s)', err)
         else:
-            self._logger.debug('Succeeded with anonymous')
-            data['username'] = 'anonymous'
-            data['password'] = ''
-            result = True
-            self._logger.debug(data)
-            item['data'].update(data)
-            item['steps'][self._id] = result
-            return
+            self._logger.info('Succeeded with anonymous')
+            item['data']['username'] = 'anonymous'
+            item['data']['password'] = ''
+            return True
         if self.lcnf.get('bruteforce', False):
-            usernames = []
-            passwords = []
-            with open(self.lcnf.get('logins'), 'r') as lfh:
-                for username in lfh:
-                    usernames.append(username.rstrip())
-            with open(self.lcnf.get('passwords'), 'r') as pfh:
-                for password in pfh:
-                    passwords.append(password.rstrip())
+            self._logger.debug('Bruteforce enabled, loading usernames and passwords')
+            usernames = [line.rstrip() for line in open(self.lcnf.get('usernames'), 'r')]
+            passwords = [line.rstrip() for line in open(self.lcnf.get('passwords'), 'r')]
             for username in usernames:
                 for password in passwords:
-                    self._logger.debug('Checking %s', username + ':' + password)
                     try:
-                        self.ftp.voidcmd('NOOP')
-                    except IOError:
-                        self.ftp = ftplib.FTP(host=item['data']['ip'], timeout=self.lcnf.get('timeout', 30))
-                    self._logger.debug('Trying %s' % (username + ':' + password))
+                        self._logger.debug('Sending NOOP')
+                        ftp.voidcmd('NOOP')
+                    except IOError as err:
+                        self._logger.debug('IOError occured (%s), attempting to open new connection', err)
+                        ftp = ftplib.FTP(host=item['data']['ip'], timeout=self.lcnf.get('timeout', 30))
                     try:
-                        self.ftp.login(username, password)
-                    except ftplib.error_perm:
+                        self._logger.debug('Trying to log in')
+                        ftp.login(username, password)
+                    except ftplib.error_perm as err:
+                        self._logger.debug('Failed (%s)', err)
                         continue
-                    except:
-                        raise
                     else:
-                        self._logger.debug('Succeeded with %s' %(username + ':' + password))
-                        data['username'] = username
-                        data['password'] = password
-                        result = True
-                        self._logger.debug(data)
-                        item['data'].update(data)
-                        item['steps'][self._id] = result
-                        return
-        self._logger.debug(data)
-        item['data'].update(data)
-        item['steps'][self._id] = result
-    def _run(self, items):
-        for item in items:
-            self._process(item)
-        return items
+                        self._logger.info('Succeeded with %s', username + ':' + password)
+                        item['data']['username'] = username
+                        item['data']['password'] = password
+                        return True
+        self._logger.info('Could not connect')
+        return False
-class FTPListFilesTask(Task):
-    def __init__(self, id, root):
-        super().__init__(id, root)
+class FTPListFilesTask(Task): # pylint: disable=too-few-public-methods
+    """Executes NLST to list files on FTP"""
     def _process(self, item):
-        item['steps'][self._id] = False
-        self.ftp = ftplib.FTP(host=item['data']['ip'],
+        ftp = ftplib.FTP(host=item['data']['ip'],
                          user=item['data']['username'],
                          passwd=item['data']['password'])
-        filelist = self.ftp.nlst()
+        filelist = ftp.nlst()
         try:
-            self.ftp.quit()
-        except:
+            ftp.quit()
+        except ftplib.Error:
+            # that's weird, but we don't care
             pass
-        try:
-            if len(filelist) == 0 or filelist[0] == "total 0":
-                item['data']['filter'] = "Empty server"
-        except IndexError:
-            pass
         item['data']['files'] = []
-        for fileName in filelist:
-            item['data']['files'].append(fileName)
-        item['steps'][self._id] = True
+        for filename in filelist:
+            item['data']['files'].append(filename)
+        return True
+class FTPFilterFilesTask(Task): # pylint: disable=too-few-public-methods
+    """Sets data.filter if FTP contains only junk"""
+    def _process(self, item):
+        junk_list = ['incoming', '..', '.ftpquota', '.', 'pub']
+        files = item['data']['files']
-    def _filter(self, item):
         item['data']['filter'] = False
-        if len(item['data']['files']) == 0:
-            item['data']['filter'] = "Empty"
-        elif len(item['data']['files']) < 6:
-            match = 0
-            for f in 'incoming', '..', '.ftpquota', '.', 'pub':
-                if f in item['data']['files']:
-                    match += 1
-            if match == len(item['data']['files']):
-                item['data']['filter'] = "EmptyWithSystemDirs"
-        if item['data']['filter'] == False:
-            item['steps'][self._id] = True
-    def _run(self, items):
-        for item in items:
-            self._process(item)
-            if self.lcnf.get('filter', False):
-                self._filter(item)
-        return items
+        try:
+            if not files or files[0] == "total 0":
+                item['data']['filter'] = "Empty"
+        except IndexError:
+            pass
+        if 0 < len(files) <= len(junk_list): # pylint: disable=C1801
+            match_count = 0
+            for filename in junk_list:
+                if filename in files:
+                    match_count += 1
+            if match_count == len(files):
+                item['data']['filter'] = "EmptyWithBloatDirs"
+        return True

View File

@ -1,19 +1,15 @@
"""
Basic tasks for Gopher services
"""
import socket

from lib.exec import Task


class GopherFindTask(Task): # pylint: disable=too-few-public-methods
    """Tries to connect Gopher service"""
    @staticmethod
    def _recv(sck):
        total_data = []
        while True:
            data = sck.recv(2048)
@ -23,7 +19,6 @@ class GopherFindTask(Task):
        return ''.join(total_data)

    def _process(self, item):
        sock = socket.socket()
        sock.settimeout(self.lcnf.get('timeout', 20))
        sock.connect((item['data']['ip'], int(70)))
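The body of `_recv` is cut off by the hunk boundary; presumably it keeps appending chunks until the peer stops sending, then joins them. A self-contained sketch of that read loop under that assumption, tested against a fake socket (`recv_all` and `FakeSocket` are illustrative names):

```python
def recv_all(sck, bufsize=2048):
    """Read chunks until recv() returns an empty value, then join them."""
    total_data = []
    while True:
        data = sck.recv(bufsize)
        if not data:
            break
        total_data.append(data)
    return ''.join(total_data)

class FakeSocket:
    """Stand-in for socket.socket, feeding canned chunks to the loop."""
    def __init__(self, chunks):
        self._chunks = list(chunks)
    def recv(self, size):
        return self._chunks.pop(0) if self._chunks else ''

print(recv_all(FakeSocket(['gopher', ' menu'])))  # gopher menu
```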


@ -0,0 +1,60 @@
from io import BytesIO
import json
import time
import zlib

import bs4
import netaddr
import requests
import urllib3
from PIL import Image
from bson.binary import Binary
from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities
from selenium.webdriver.common.proxy import Proxy, ProxyType

from lib.exec import Task


class HTTPFindTask(Task):
    def _process(self, item):
        urllib3.disable_warnings()
        response = requests.get(url='http://%s:%s/' % (item['data']['ip'], item['data']['port']),
                                timeout=self.lcnf.get('timeout', 20),
                                verify=False)
        if response.status_code in [400, 401, 403, 500]:
            raise self.PipelineError("Bad response")
        item['data']['response'] = {}
        item['data']['response']['code'] = response.status_code
        item['data']['response']['text'] = response.text
        item['data']['response']['content'] = response.content
        item['data']['response']['encoding'] = response.encoding
        item['data']['response']['headers'] = response.headers
        # honor the declared charset only if the server actually sent one
        encoding = response.encoding if 'charset' in response.headers.get('content-type', '').lower() else None
        soup = bs4.BeautifulSoup(response.content, "html.parser", from_encoding=encoding)
        if soup.original_encoding != 'utf-8':
            meta = soup.select_one('meta[charset], meta[http-equiv="Content-Type"]')
            if meta:
                if 'charset' in meta.attrs:
                    meta['charset'] = 'utf-8'
                else:
                    meta['content'] = 'text/html; charset=utf-8'
            item['data']['response']['text'] = soup.prettify()  # encodes to UTF-8 by default
        title = soup.select_one('title')
        item['data']['title'] = title.string if title and title.string else ""
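The title extraction relies on BeautifulSoup, but the same fallback behavior (empty string when `<title>` is missing or has no text) can be sketched with only the standard library; `TitleParser` and `extract_title` are illustrative names:

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text found inside <title> elements."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == 'title':
            self._in_title = True
    def handle_endtag(self, tag):
        if tag == 'title':
            self._in_title = False
    def handle_data(self, data):
        if self._in_title:
            self.title += data

def extract_title(html):
    """Return the page title, or '' when none is present."""
    parser = TitleParser()
    parser.feed(html)
    return parser.title

print(extract_title('<head><title>index of /</title></head>'))  # index of /
print(repr(extract_title('<head></head>')))                     # ''
```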


@ -0,0 +1,10 @@
import tempfile
from lib.exec import Task
class VNCFindTask(Task): # pylint: disable=too-few-public-methods
    """Tries to connect VNC service"""
def _process(self, item):
fd, temp_path = tempfile.mkstemp()
print(fd, temp_path)
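`tempfile.mkstemp` returns an already-open file descriptor that the caller must close explicitly; a sketch of the pattern the stub above would presumably grow into:

```python
import os
import tempfile

fd, temp_path = tempfile.mkstemp()
try:
    # e.g. hand temp_path to an external capture tool, or write to fd directly
    os.write(fd, b"placeholder data")
finally:
    os.close(fd)         # mkstemp does not close the descriptor for you
    os.remove(temp_path)
print(os.path.exists(temp_path))  # False
```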


@ -0,0 +1,19 @@
@ -0,0 +1,19 @@
#code{{ data['response']['code'] }}
{% if data['response']['headers'].get('Server') %} Server: #{{ data['response']['headers']['Server'] }}
{% endif %}
{% if data['title'] %}Title: {{ data['title'] }}
{% endif %}Geo: {{ data['geo']['country'] }}/{{ data['geo']['city'] }}
http://{{ ip }}:{{ port }}/
{# the trailing tag was built in code as "#http_" + str(int(netaddr.IPAddress(ip))) #}
#http_{{ ip_int }}
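The `#http_...` tag encodes the host IP as a single integer. The same value the `netaddr.IPAddress` call produces can be computed with the standard-library `ipaddress` module; the `http_tag` helper name is illustrative:

```python
import ipaddress

def http_tag(ip):
    """Build the '#http_<int>' tag from a dotted-quad IP string."""
    return "#http_%d" % int(ipaddress.ip_address(ip))

print(http_tag('127.0.0.1'))  # #http_2130706433
```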


@ -1,10 +1,10 @@
import logging
import importlib
import sys, os

from Config import cnf as config


class Logger(logging.Logger):
    """Logger. standard logging logger with some shitcode on the top"""
    def __init__(self, name):
@ -95,8 +95,9 @@ class Loader:
        return getattr(result, name)

    @classmethod
    def by_id(cls, section, id) -> Loadable:
        """Returns instantiated object of class provided in configuration"""
        # prepares Loader for certain package
        loader = cls(config.get(section).get(id).get('package'))
        # loads class from this package and returns instantiated object of this class
        return loader.get(config.get(section).get(id).get('service'))(id=id, root=config.get(section))
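`Loader.by_id` resolves a dotted package path and a class name from configuration at runtime. Stripped of the config plumbing, the core of that pattern is just `importlib.import_module` plus `getattr`; the `load_class` helper below is illustrative:

```python
import importlib

def load_class(package, class_name):
    """Import a module by dotted path and return the named attribute."""
    module = importlib.import_module(package)
    return getattr(module, class_name)

# standard-library stand-in for e.g. ('lib.plugin.base.lib.IP', 'RandomIP')
ordered_dict_cls = load_class('collections', 'OrderedDict')
print(ordered_dict_cls.__name__)  # OrderedDict
```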


@ -1,5 +1,4 @@
#!/usr/bin/python3

import time
@ -20,22 +19,24 @@ class Core:
        self._services.append(service)

    def start(self):
        """Starts all loaded services"""
        self.logger.info("Starting")
        for service in self._services:
            service.start()
        self.logger.info("Started")

    def stop(self):
        """Stops all loaded services"""
        self.logger.info("Stopping Core")
        for service in self._services:
            service.stop()
        self.logger.info("Stopped")


if __name__ == '__main__':
    CORE = Core()
    CORE.start()
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        CORE.stop()
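The start/stop fan-out over `self._services` only requires that every service expose `start()` and `stop()`. It can be exercised with stubs; `DummyService` and `MiniCore` are illustrative, not repo classes:

```python
class DummyService:
    """Stub honoring the start()/stop() contract Core expects."""
    def __init__(self):
        self.running = False
    def start(self):
        self.running = True
    def stop(self):
        self.running = False

class MiniCore:
    """Reduced Core: supervises a flat list of services."""
    def __init__(self, services):
        self._services = services
    def start(self):
        for service in self._services:
            service.start()
    def stop(self):
        for service in self._services:
            service.stop()

services = [DummyService(), DummyService()]
core = MiniCore(services)
core.start()
print(all(s.running for s in services))  # True
core.stop()
print(any(s.running for s in services))  # False
```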