a78a99ec44
new: [api] add a hashlookup:parent-total
which indicates the cardinality of the parents
...
This can be used for the new API endpoint to paginate over large sets of
parents.
2021-10-31 09:04:25 +01:00
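The entry above adds the hashlookup:parent-total field. A minimal client-side sketch of how it could drive pagination, assuming a local hashlookup-server on port 5000 and a paginated parents route of the form /parents/<sha1>/<count>/<cursor>; the route, port and response keys are assumptions, not confirmed by this log:

```python
# Hedged sketch: use hashlookup:parent-total to page through a large set of
# parents. The /parents/<sha1>/<count>/<cursor> route and the
# "parents"/"cursor" response keys are assumptions for illustration only.
import requests

BASE = "http://127.0.0.1:5000"   # assumed local hashlookup-server instance
sha1 = "0" * 40                  # placeholder: SHA-1 to look up

record = requests.get(f"{BASE}/lookup/sha1/{sha1}").json()
total = int(record.get("hashlookup:parent-total", 0))
print(f"record reports {total} parent(s)")

cursor, fetched = "0", 0
while fetched < total:
    page = requests.get(f"{BASE}/parents/{sha1}/100/{cursor}").json()
    parents = page.get("parents", [])
    if not parents:
        break
    fetched += len(parents)
    cursor = str(page.get("cursor", "0"))
    if cursor == "0":            # backend signals no further pages
        break
```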
9e79d8ecc7
chg: [server] add auth header in pub-sub
2021-10-25 21:39:38 +02:00
b4c539114f
Merge branch 'main' of github.com:adulau/hashlookup-server into main
2021-09-24 08:17:09 +02:00
7c174204f7
fix: [dns] records are now too large for a single hashlookup record
...
reduced to a subset of fields, with the HTTP interface as the fall-back to
get more info
2021-09-24 08:16:08 +02:00
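A hedged sketch of the lookup pattern this change implies: query the reduced DNS record first and fall back to the HTTP interface for the full record. The DNS zone and HTTP host below follow the public hashlookup service naming and are assumptions for a self-hosted instance:

```python
# Hedged sketch: reduced DNS record first, HTTP interface as the fall-back.
import json
import requests
import dns.resolver   # dnspython

sha1 = "0" * 40                              # placeholder: SHA-1 to look up
zone = "dns.hashlookup.circl.lu"             # assumed DNS zone
http_base = "https://hashlookup.circl.lu"    # assumed HTTP interface

summary = None
try:
    for rdata in dns.resolver.resolve(f"{sha1}.{zone}", "TXT"):
        summary = json.loads(b"".join(rdata.strings).decode())
        break
except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer, json.JSONDecodeError):
    pass

# The DNS answer only carries a few fields; fetch the full record over HTTP.
full_record = requests.get(f"{http_base}/lookup/sha1/{sha1}").json()
```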
15e6f461d4
fix: [api/stats] existing hash values are removed from the nx statistics
2021-09-11 00:02:34 +02:00
d6fe136421
chg: [api/info] now returns the default stats from the new importer
...
TODO: add the diff with additional sources and key numbers
2021-09-09 07:34:37 +02:00
ac7ec00c97
chg: [config] don't flush db by default
2021-09-06 16:21:34 +02:00
aa58ad8152
chg: [stats/top] remove recently added hashes from the previously non-existing (nx) set
2021-09-05 21:59:20 +02:00
6461b91d55
fix: [api/lookup/sha1] missing parents bug
2021-09-05 07:36:44 +02:00
c470201fd8
fix: [api:md5/sha1] large sets of parents are now limited and return a random selection
...
TODO: mainly empty files and similar - warning-lists should be added
2021-09-01 21:36:39 +02:00
74c0e8c8c2
fix: [api/md5] if there is already more data in the default SHA1 record, use that one instead of the fall-back MD5 lookup
2021-09-01 19:56:50 +02:00
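A rough server-side sketch of the preference described above, assuming a Redis/kvrocks layout where an l:<MD5> key points to a SHA1 and h:<hash> holds the record fields; the key names and the "more data" test are assumptions for illustration, not the code from this commit:

```python
# Hedged sketch: prefer the richer SHA1 record over the fall-back MD5 record.
# Key names (l:<md5>, h:<hash>) and the field-count comparison are assumptions.
import redis

r = redis.Redis(host="127.0.0.1", port=6666, decode_responses=True)  # assumed kvrocks backend

def lookup_md5(md5: str) -> dict:
    md5 = md5.upper()
    md5_record = r.hgetall(f"h:{md5}")
    sha1 = r.get(f"l:{md5}")          # MD5 -> SHA1 pointer, if any
    if sha1:
        sha1_record = r.hgetall(f"h:{sha1}")
        if len(sha1_record) > len(md5_record):
            return sha1_record        # default SHA1 record has more data
    return md5_record                 # fall back to the MD5-only record
```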
fe04f1e7dd
chg: [api/md5] data sources with MD5-only hashes
2021-09-01 19:37:21 +02:00
5f93d3e7ab
Merge branch 'main' of github.com:adulau/hashlookup-server into main
2021-09-01 19:27:27 +02:00
bb29fcb25c
chg: [api] MD5 lookup updated to allow MD5-only records
2021-09-01 19:26:58 +02:00
12b9642610
Update DATABASE.md
2021-08-31 22:47:44 +02:00
413191c712
Merge pull request #8 from wllm-rbnt/fix
...
Fix jq parse error on special chars
2021-08-31 21:43:45 +02:00
William Robinet
09b56135d1
Fix jq parse error on special chars
2021-08-31 12:11:50 +02:00
c17dbff6a6
new: [api:stats/top] Add a new optional endpoint to get the top 100 most queried hashes (existing and non-existing)
2021-08-29 14:06:35 +02:00
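A minimal client call for this optional endpoint; the /stats/top path mirrors the commit tag, and the shape of the response is not specified in this log, so the sketch only prints whatever comes back:

```python
# Hedged sketch: fetch the optional top-100 queried-hashes statistics.
import requests

top = requests.get("https://hashlookup.circl.lu/stats/top").json()
print(top)   # top queried hashes, existing and non-existing
```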
4419052c4f
fix: [api:bulk] add a proper check of MD5 and SHA1 values before further processing
2021-08-29 12:25:43 +02:00
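What a "proper check" can look like in practice, as an illustrative helper rather than the code from the commit: accept only well-formed hexadecimal digests of the expected length before bulk processing.

```python
# Hedged sketch: validate MD5/SHA1 values before further bulk processing.
import re

MD5_RE = re.compile(r"[0-9a-fA-F]{32}")
SHA1_RE = re.compile(r"[0-9a-fA-F]{40}")

def filter_valid(hashes, algorithm="sha1"):
    """Keep only syntactically valid digests for the given algorithm."""
    pattern = SHA1_RE if algorithm == "sha1" else MD5_RE
    return [h for h in hashes if pattern.fullmatch(h)]

# Example: the SHA-1 of an empty file is kept, the malformed value is dropped.
print(filter_valid(["da39a3ee5e6b4b0d3255bfef95601890afd80709", "not-a-hash"]))
```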
a14e5aedf1
new: [api:bulk] add support for pub-sub channel of existing and
...
non-existing hashes
2021-08-29 11:52:07 +02:00
ecc2baf2f9
chg: [doc] added keys related to packages
2021-08-24 22:53:53 +02:00
cea5524490
chg: [req] added
2021-08-24 14:57:15 +02:00
1a94f6c246
fix: [req] moved
2021-08-24 14:56:56 +02:00
a8b951cf83
chg: [server] add children hashes if they exist
2021-08-24 14:53:43 +02:00
e9fb182c82
Update DATABASE.md
2021-08-23 09:17:20 +02:00
02a675d9ea
fix: [api] typo fixed
2021-08-22 23:37:05 +02:00
a17cd38595
chg: [api] lookup adds parent details
2021-08-22 23:23:52 +02:00
8504b492f7
chg: [doc] children added
2021-08-22 19:20:41 +02:00
670e3dd52c
chg: [doc] parent added
2021-08-22 19:18:21 +02:00
27aa7c4034
fix: [api] fix missing TTL bug
2021-08-22 16:48:06 +02:00
ab1b74e109
chg: [doc] add the default keys used
2021-08-22 16:20:55 +02:00
3a8185dd33
Merge branch 'main' of github.com:adulau/hashlookup-server into main
2021-08-22 16:14:02 +02:00
99609502cb
chg: [doc] document basic storage format
2021-08-22 16:13:41 +02:00
a0c858884a
Update README.md
2021-08-18 10:21:07 +02:00
d43a693a7d
Typo fixed
2021-08-18 10:17:22 +02:00
284e4719c7
new: [feature] session handling added
...
A user can now create a session, assign lookup results to a session
and retrieve the lookup session results in one shot.
This partially implements the feature requested in issue #2 to support
DFIR sessions.
Thanks to Koen Van Impe for the idea.
2021-08-14 10:32:00 +02:00
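A hedged sketch of the DFIR session workflow this adds: create a session, tag lookups with it, then retrieve everything in one shot. The /session routes and the session header name are assumptions based on the feature description:

```python
# Hedged sketch of the session workflow: create, tag lookups, retrieve.
# Route names and the "hashlookup_session" header are assumptions.
import requests

BASE = "https://hashlookup.circl.lu"   # or a local hashlookup-server instance
session_name = "case-2021-001"         # hypothetical session identifier

requests.get(f"{BASE}/session/create/{session_name}")

headers = {"hashlookup_session": session_name}
for sha1 in ("da39a3ee5e6b4b0d3255bfef95601890afd80709",):
    requests.get(f"{BASE}/lookup/sha1/{sha1}", headers=headers)

# Retrieve the collected lookup results for the session in one shot
results = requests.get(f"{BASE}/session/get/{session_name}").json()
print(results)
```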
e87e8a412d
chg: [doc] add the import stat key
2021-08-13 22:56:06 +02:00
31ac93e764
new: [pub-sub] add pub-sub functionality for searched hashes in two
...
different channels:
- nx: non-existing hash value
- exist: existing hash value
2021-08-13 22:42:41 +02:00
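A hedged consumer-side sketch for these channels, assuming the Redis-compatible backend used by the server and the channel names quoted in the message (the names actually configured in the server may differ):

```python
# Hedged sketch: subscribe to the pub-sub channels of searched hashes.
# Channel names ("nx", "exist") follow the commit message and are assumptions.
import redis

r = redis.Redis(host="127.0.0.1", port=6666, decode_responses=True)  # assumed backend
p = r.pubsub()
p.subscribe("nx", "exist")

for message in p.listen():
    if message["type"] == "message":
        print(message["channel"], message["data"])   # channel name and hash value
```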
72b462b5ea
new: [statistics] add an optional statistics option in the server to keep
...
a sorted set of matching and non-matching hashes.
2021-08-13 22:13:25 +02:00
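A hedged sketch of how such a statistic can be kept in a Redis sorted set and read back for the stats/top view; the key names and the use of two separate sets are assumptions for illustration:

```python
# Hedged sketch: count matching and non-matching queries in sorted sets.
# Key names (stat:exist, stat:nx) are assumptions for illustration.
import redis

r = redis.Redis(host="127.0.0.1", port=6666, decode_responses=True)

def record_query(hash_value: str, found: bool) -> None:
    key = "stat:exist" if found else "stat:nx"
    r.zincrby(key, 1, hash_value)            # bump the per-hash counter

def top_queried(key: str = "stat:exist", n: int = 100):
    # Highest counters first, returned with their scores
    return r.zrevrange(key, 0, n - 1, withscores=True)
```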
fc11f3dd88
chg: [poc-import] JSON updated to add legacy and JSON file jqified
2021-07-29 11:58:25 +02:00
ef60dd73a9
chg: [requirements] remove standard library Python modules
2021-07-29 11:55:26 +02:00
c54375c28b
Merge pull request #4 from cudeso/main
...
PoC to better streamline the import of NSRL data.
2021-07-29 11:45:13 +02:00
Koen Van Impe
ca910ff22f
Support for import of NSRL datasets in ISO and ZIP format
2021-07-29 10:46:59 +02:00
Koen Van Impe
d280216361
Merge branch 'main' of https://github.com/cudeso/hashlookup-server into main
2021-07-27 18:37:16 +02:00
Koen Van Impe
0edb33ed82
PoC for streamlining import
...
PoC to better streamline the import of NSRL data.
Still requires some work, but the basic concept works.
Currently only tested with the Android dataset.
2021-07-27 18:37:12 +02:00
5932226ece
chg: [install] kvrocks default repo and latest release
2021-07-17 07:59:05 +02:00
49bf5be12b
Merge pull request #1 from cudeso/main
...
Update README.md
2021-07-16 17:06:38 +02:00
Koen Van Impe
97e2d2c8aa
Update README.md
2021-07-16 16:25:03 +02:00
a4b3b7ba60
new: [hashlookup-server] initial import of the code
...
- This includes a simple HTTP server for doing bulk and single lookups of hashes.
- A simple DNS server to do lookups via DNS
- Various import scripts for NSRL
This works on a test instance.
TODO:
- Automatic script for NSRL download and import
- Bloom filter export
- Improved documentation
2021-07-15 17:49:52 +02:00
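A minimal sketch of using the HTTP server from this initial import for a single lookup and a bulk lookup, assuming a local instance and the /lookup and /bulk conventions later documented for the public hashlookup API (host, port and payload shape are assumptions):

```python
# Hedged sketch: single and bulk lookups against a local hashlookup-server.
import requests

BASE = "http://127.0.0.1:5000"   # assumed local instance

# Single SHA1 lookup (SHA-1 of an empty file as a harmless example value)
empty_sha1 = "da39a3ee5e6b4b0d3255bfef95601890afd80709"
print(requests.get(f"{BASE}/lookup/sha1/{empty_sha1}").json())

# Bulk lookup; the {"hashes": [...]} payload shape is an assumption here
print(requests.post(f"{BASE}/bulk/sha1", json={"hashes": [empty_sha1]}).json())
```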