Implement a shared-memory-based authentication cache. It is a simple
local cache indexed by IP address that keeps track of that IP's
auth info, such as username, allowed categories and timeouts. This
provides the basis for a captive portal, per-user definable category
restrictions, and soft blocks (blocks which the user can override
by clicking a button on the blocked page).
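As a rough sketch of what one per-IP record in such a shared-memory cache could look like (the struct fields, sizes and the `auth_entry`/`auth_slot` names are illustrative assumptions, not the actual squark structures):

```c
#include <stdint.h>
#include <time.h>

/* Hypothetical layout of one per-IP entry in the shared-memory
 * authentication cache; field names and sizes are illustrative
 * guesses, not the real squark structures. */
struct auth_entry {
    uint32_t ipv4;           /* key: client IPv4 address */
    char     username[32];   /* authenticated user, if any */
    uint32_t categories;     /* bitmask of allowed categories */
    time_t   auth_expiry;    /* when the authentication times out */
    time_t   soft_block_ok;  /* soft-block override valid until */
};

/* Map an IP to a slot in a fixed-size shared-memory table. */
static unsigned auth_slot(uint32_t ipv4, unsigned table_size)
{
    return ipv4 % table_size;
}
```

A fixed-size table keeps the shared segment a constant size, which makes it easy for several helper processes to map the same region.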
Will implement 'captive portal' style authentication with a separate
DB later.
will need an authentication DB later too.
Keep the modifications needed for key lookup inside the lookup
routine; this includes, e.g., lower-casing the URL. This way we
can pass the exact original request string to our block page script.
This also changes the way 'www123.' is stripped from the request.
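A minimal sketch of such a lookup-side normalization, assuming it lower-cases the host and drops a leading "www<digits>." label only when a second-level name remains (the `normalize_host` function is hypothetical, and it would run on a private copy so the original request string stays intact):

```c
#include <ctype.h>
#include <string.h>

/* Illustrative normalization done during lookup: lower-case the
 * host and strip a leading "www<digits>." label, but only when a
 * second-level domain name would remain afterwards. */
static void normalize_host(char *host)
{
    for (char *p = host; *p; p++)
        *p = (char) tolower((unsigned char) *p);

    if (strncmp(host, "www", 3) == 0) {
        const char *p = host + 3;
        while (isdigit((unsigned char) *p))
            p++;
        /* require another dot so "www.com" itself is not mangled */
        if (*p == '.' && strchr(p + 1, '.') != NULL)
            memmove(host, p + 1, strlen(p + 1) + 1);
    }
}
```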
Ability to force reauthentication (HP ProCurve specific) for the
switch port to which we traced the IP. This currently works only with
the HP WebAuth scheme (it should be possible with the MAC auth scheme
too).
and pass the denied URL too.
properly match them against db data.
Properly embed the IPv4 address in the database now. Teach the filter
to understand the two new reserved component IDs.
Should be faster in most cases to write two null words than to
copy them around.
Lower-case the DNS part of the URL. Also skip "www123" and similar
entries when determining whether path components should be matched.
Implement the squid redirect protocol. It implements the "concurrent"
version even though the algorithm is non-blocking; doing this can
reduce the number of read system calls on a busy system.
Minimal command-line-based configuration for banning specific
categories and specifying the redirect site. Will probably have
to add some sort of config file system later.
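In the concurrent helper protocol, each request line from squid starts with a channel ID that the helper must echo back in its reply, which is what lets squid multiplex many requests over one helper pipe. A simplified sketch of parsing that prefix (the `parse_channel_id` function is hypothetical and skips the bounds and field handling real code would need):

```c
#include <stdlib.h>
#include <string.h>

/* Sketch of reading one request line of squid's "concurrent" helper
 * protocol: a numeric channel ID, a space, then the URL and further
 * fields. The ID must be echoed back in the reply so squid can match
 * it to the pending request. Parsing here is deliberately minimal. */
static int parse_channel_id(const char *line, char *url, size_t urlsz)
{
    char *end;
    long id = strtol(line, &end, 10);
    if (end == line || *end != ' ')
        return -1;                       /* malformed: no "ID " prefix */

    const char *start = end + 1;
    size_t len = strcspn(start, " \n");  /* URL is the next token */
    if (len == 0 || len >= urlsz)
        return -1;
    memcpy(url, start, len);
    url[len] = '\0';
    return (int) id;
}
```

The reply would then be written back as the same ID followed by the verdict (or a rewritten URL) on one line.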
Fixes the "has sub-domains/paths" hints to be correct. Matching
"www<number>" as the first domain entry now checks that it won't
remove second-level domain names.
The filter code now looks up path components from the DB.
So we don't need an explicit null terminator in most cases, saving
space. It also speeds up comparisons, since getting a string blob is
now constant time (no strlen needed).
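The idea can be sketched as a length-prefixed blob; the 2-byte little-endian prefix and the `blob_len`/`blob_cmp` helpers below are assumptions for illustration, not the actual on-disk encoding:

```c
#include <stddef.h>
#include <string.h>

/* Sketch of a length-prefixed string blob: an assumed 2-byte
 * little-endian length followed by the raw bytes, with no null
 * terminator. The length is read in constant time (no strlen),
 * and the terminator byte is saved. */
static size_t blob_len(const unsigned char *blob)
{
    return blob[0] | ((size_t) blob[1] << 8);
}

static int blob_cmp(const unsigned char *blob, const char *s, size_t slen)
{
    size_t len = blob_len(blob);
    if (len != slen)               /* differing lengths: order by length */
        return len < slen ? -1 : 1;
    return memcmp(blob + 2, s, slen);
}
```

Comparing lengths first also lets most mismatches be rejected without touching the string bytes at all.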
Analysis of the URL host part, plus some simple tests. Not usable as
a squid filter yet.
store the names of categories in the database
u_int32_t is not standard, use uint32_t from stdint.h instead.
Normalizing macro names to upper case and extending functionality.
it's useful in binaries other than squark-auth too.
Implement the basics of squarkdb, which will be used by squark-filter
to categorize URIs. The implementation is based on libcmph and uses
a file format suitable for being mmap'ed from squark-filter.
Lua code is used to create the squark database from standard
domain / URL blacklists.
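A minimal sketch of how a filter process can map such a read-only, offline-built database (the `map_db` name is hypothetical, and the actual squarkdb header and record layout are not shown; this assumes a POSIX system):

```c
#include <fcntl.h>
#include <stddef.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

/* Sketch: open the database read-only, stat it for its size, and
 * mmap it shared. Several filter processes mapping the same file
 * then share one copy of the data in the page cache. */
static void *map_db(const char *path, size_t *size_out)
{
    int fd = open(path, O_RDONLY);
    if (fd < 0)
        return NULL;

    struct stat st;
    if (fstat(fd, &st) < 0 || st.st_size == 0) {
        close(fd);
        return NULL;
    }

    void *map = mmap(NULL, (size_t) st.st_size, PROT_READ, MAP_SHARED, fd, 0);
    close(fd);  /* the mapping remains valid after close */
    if (map == MAP_FAILED)
        return NULL;

    *size_out = (size_t) st.st_size;
    return map;
}
```

This is why a format designed to be used in place (fixed offsets, no pointers) matters: lookups can run directly against the mapped bytes with no parse step at startup.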
This allows setting the SNMPv3 configuration via the standard config
files. If an SNMP community is given on the command line, we fall
back to SNMPv2c mode.
basics of the helper module explained.
Basic functionality implemented.