Commit graph

29 commits

Author SHA1 Message Date
af5552ef75
new: [api] /lookup/sha256 api endpoint added 2021-11-19 07:26:50 +01:00
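A minimal client sketch for this endpoint, assuming a local hashlookup-server instance on http://127.0.0.1:5000 (host and port are assumptions; only the /lookup/sha256 path is stated in the commit):

    import json
    import urllib.request

    BASE_URL = "http://127.0.0.1:5000"  # assumed local instance

    # SHA-256 of an empty file, used here only as a test value.
    digest = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"

    with urllib.request.urlopen(f"{BASE_URL}/lookup/sha256/{digest}") as response:
        record = json.load(response)

    print(record)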
a78a99ec44
new: [api] add a hashlookup:parent-total key which indicates the cardinality of the parents
This can be used by the new API endpoint to paginate over a large set of
parents.
2021-10-31 09:04:25 +01:00
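A sketch of how a client could use this field to decide whether to paginate (the hashlookup:parent-total key name comes from the commit; the rest of the response layout, host and port are assumptions):

    import json
    import urllib.request

    BASE_URL = "http://127.0.0.1:5000"  # assumed local instance
    digest = "da39a3ee5e6b4b0d3255bfef95601890afd80709"  # SHA-1 of an empty file

    with urllib.request.urlopen(f"{BASE_URL}/lookup/sha1/{digest}") as response:
        record = json.load(response)

    # hashlookup:parent-total carries the cardinality of the parents, so the
    # client knows whether the returned parent list is complete or truncated.
    total = int(record.get("hashlookup:parent-total", 0))
    returned = len(record.get("parents", []))
    if returned < total:
        print(f"{total - returned} additional parents to fetch via pagination")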
9e79d8ecc7
chg: [server] add auth header in pub-sub 2021-10-25 21:39:38 +02:00
15e6f461d4
fix: [api/stats] existing hash values removed from nx 2021-09-11 00:02:34 +02:00
d6fe136421
chg: [api/info] now returns the default stats from the new importer
TODO: add the diff with additional sources and key numbers
2021-09-09 07:34:37 +02:00
ac7ec00c97
chg: [config] don't flush db by default 2021-09-06 16:21:34 +02:00
aa58ad8152
chg: [stats/top] remove recently added hashes from the previously nx set 2021-09-05 21:59:20 +02:00
6461b91d55
fix: [api/lookup/sha1] missing parents bug 2021-09-05 07:36:44 +02:00
c470201fd8
fix: [api:md5/sha1] large sets of parents are now limited and return a random selection
TODO: these are mainly empty files and similar - warning-lists should be added
2021-09-01 21:36:39 +02:00
74c0e8c8c2
fix: [api/md5] if there is already more data in the default SHA1 record, we use that one and not the fall-back MD5 lookup 2021-09-01 19:56:50 +02:00
fe04f1e7dd
chg: [api/md5] data sources with MD5-only hashes 2021-09-01 19:37:21 +02:00
bb29fcb25c
chg: [api] MD5 lookup updated to allow MD5-only records 2021-09-01 19:26:58 +02:00
c17dbff6a6
new: [api:stats/top] Add a new optional entry point to get the top 100 most queried hashes (existing and non-existing) 2021-08-29 14:06:35 +02:00
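A hedged example of calling that entry point, assuming it is exposed as /stats/top on a local instance (the exact path and host are assumptions derived from the [api:stats/top] tag):

    import json
    import urllib.request

    BASE_URL = "http://127.0.0.1:5000"  # assumed local instance

    # Assumed path derived from the commit tag above.
    with urllib.request.urlopen(f"{BASE_URL}/stats/top") as response:
        top = json.load(response)

    # Expected to list the most queried hashes, existing and non-existing.
    print(json.dumps(top, indent=2))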
4419052c4f
fix: [api:bulk] add proper check of MD5 and SHA1 value before further processing 2021-08-29 12:25:43 +02:00
a14e5aedf1
new: [api:bulk] add support for pub-sub channel of existing and
non-existing hashes
2021-08-29 11:52:07 +02:00
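A sketch of a bulk query that would feed those channels, assuming a /bulk/md5 endpoint accepting a JSON body with a "hashes" list (the request shape is an assumption; the commits only state that a bulk API exists and that its MD5/SHA1 input is validated):

    import json
    import urllib.request

    BASE_URL = "http://127.0.0.1:5000"  # assumed local instance

    # MD5 of an empty file, used here only as a test value.
    payload = json.dumps({"hashes": ["d41d8cd98f00b204e9800998ecf8427e"]}).encode()

    request = urllib.request.Request(
        f"{BASE_URL}/bulk/md5",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        print(json.load(response))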
a8b951cf83
chg: [server] add child hashes if they exist 2021-08-24 14:53:43 +02:00
02a675d9ea
fix: [api] typo fixed 2021-08-22 23:37:05 +02:00
a17cd38595
chg: [api] lookup add parent details 2021-08-22 23:23:52 +02:00
27aa7c4034
fix: [api] fix ttl missing bug 2021-08-22 16:48:06 +02:00
284e4719c7
new: [feature] session handling added
A user can now create a session, assign lookup results to a session
and retrieve the lookup session results in one shot.

This partially implements the feature requested in issue #2 to support
DFIR sessions.

Thanks to Koen Van Impe for the idea.
2021-08-14 10:32:00 +02:00
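A hypothetical client flow for such a session; the endpoint names below are illustrative only, since the commit describes the capability but not the exact paths:

    import json
    import urllib.request

    BASE_URL = "http://127.0.0.1:5000"  # assumed local instance
    session = "dfir-case-42"            # hypothetical session name

    def get(path):
        with urllib.request.urlopen(f"{BASE_URL}{path}") as response:
            return json.load(response)

    # Hypothetical endpoints: create a session, run lookups tagged with it,
    # then retrieve all results for the session in one shot.
    get(f"/session/create/{session}")
    get(f"/lookup/sha1/da39a3ee5e6b4b0d3255bfef95601890afd80709?session={session}")
    print(get(f"/session/get/{session}"))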
31ac93e764
new: [pub-sub] add pub-sub functionality for searched hashes in two
different channels:
- nx: non-existing hash value
- exist: existing hash value
2021-08-13 22:42:41 +02:00
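A minimal subscriber sketch, assuming the channels are exposed through the kvrocks/Redis-compatible backend and the redis-py package is installed (the nx and exist channel names come from the commit; the backend address, port and exact channel naming are assumptions):

    import redis  # third-party package: pip install redis

    # Assumed backend address; kvrocks listens on port 6666 by default.
    client = redis.Redis(host="127.0.0.1", port=6666)

    pubsub = client.pubsub()
    pubsub.subscribe("nx", "exist")  # channel names taken from the commit message

    for message in pubsub.listen():
        if message["type"] == "message":
            channel = message["channel"].decode()
            print(f"{channel}: {message['data'].decode()}")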
72b462b5ea
new: [statistics] add an optional statistics option to the server to keep a
sorted set of matching and non-matching hashes.
2021-08-13 22:13:25 +02:00
fc11f3dd88
chg: [poc-import] JSON updated to add legacy and JSON file jqified 2021-07-29 11:58:25 +02:00
ef60dd73a9
chg: [requirements] remove standard-library Python modules 2021-07-29 11:55:26 +02:00
Koen Van Impe
ca910ff22f Support for import of NSRL datasets in ISO and ZIP format 2021-07-29 10:46:59 +02:00
Koen Van Impe
d280216361 Merge branch 'main' of https://github.com/cudeso/hashlookup-server into main 2021-07-27 18:37:16 +02:00
Koen Van Impe
0edb33ed82 PoC for streamlining import
PoC to better streamline the import of NSRL data.
Still requires some work, but the basic concept works.
Currently only tested with Android.
2021-07-27 18:37:12 +02:00
5932226ece
chg: [install] kvrocks default repo and latest release 2021-07-17 07:59:05 +02:00
a4b3b7ba60
new: [hashlookup-server] initial import of the code
- This includes a simple HTTP server for bulk and single lookups of hashes.
- A simple DNS server to do lookups via DNS
- Various import scripts for NSRL

This works on a test instance.

TODO:

- Automatic script for NSRL download and import
- Bloomfilter export
- Improved documentation
2021-07-15 17:49:52 +02:00
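For the DNS lookup path, a hedged example using the dnspython package against a hypothetical local instance (record type, port and naming convention are assumptions; the commit only states that a simple DNS lookup server exists):

    import dns.resolver  # third-party package: pip install dnspython

    # Hypothetical resolver pointed at a local hashlookup DNS server.
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = ["127.0.0.1"]
    resolver.port = 53

    digest = "da39a3ee5e6b4b0d3255bfef95601890afd80709"  # SHA-1 of an empty file

    # Assumed convention: query the hash as a name and read the TXT answer.
    for record in resolver.resolve(digest, "TXT"):
        print(record.to_text())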