DeepSurface: Tasks

Activity > Tasks is where you can manually kick off any of the available background tasks that run in DeepSurface. It is also a helpful area for troubleshooting tasks that have run and for getting information about what happened during a particular task and when it was last run. At present, five task types are available, and each is covered in its own detailed section below.

General Overview

Each of the available tasks does something very different within the application, but they all share a lot of similar DNA. When first visiting this interface, you will be greeted with something that looks like the following:

Tasks

Each available task is represented by a card. The name of the task is in the top left (along with a handy contextual help icon). Below the name is any available information about the most recent time the task ran in the system. Schedule and job history are also shown for a given task if available (not all tasks can be scheduled or keep track of job history).

On the far right of each card is a button that allows you to manually start that task. Each task requires different follow-up information in order to run, which is covered in greater detail in the documentation for that task. One constant, however, is the option to automatically "Run next task in sequence when finished?".

sequence

This option is checked by default for each task. It is helpful to think of the tasks as a sequential series that naturally flows from one into the next. The order of the tasks on this page is no accident: each task leads to the next and is necessary for the subsequent tasks to run. Therefore, if left checked, kicking off any task on the page will automatically start the next task below it, and so on, until the bottom task (Risk Analysis and Prioritization) has run. Some users find it useful to uncheck this box if they just want to run one specific task, without having to wait for everything else that follows in the sequence.
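
To make the sequencing concrete, here is a minimal sketch of that behavior in Python. It is purely illustrative: the task names come from this page, but the functions are hypothetical stand-ins, not DeepSurface code.

    # Illustrative only: models the "Run next task in sequence when finished?" option.
    # The task names match the cards on this page; run_task() is a hypothetical
    # stand-in for however a task is actually started.
    TASK_SEQUENCE = [
        "Rule Engine Data Feed",
        "Credentialed Scan",
        "Process Scan Queue",
        "Import Vulnerability Source Data",
        "Risk Analysis and Prioritization",
    ]

    def run_task(name: str) -> None:
        """Hypothetical placeholder for kicking off a single task."""
        print(f"Running task: {name}")

    def kick_off(task_name: str, run_next_in_sequence: bool = True) -> None:
        """Start a task and, if requested, every task that follows it in the sequence."""
        start = TASK_SEQUENCE.index(task_name)
        tasks = TASK_SEQUENCE[start:] if run_next_in_sequence else [task_name]
        for name in tasks:
            run_task(name)

    # Starting "Credentialed Scan" with the box checked also runs the three tasks below it.
    kick_off("Credentialed Scan", run_next_in_sequence=True)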

Task Status

The status of a given task is shown immediately to the right of the task name, if available. If a task has not been run, no status is shown. Possible status options are:

Sometimes a task will also have an error message to show you about the most recent job that encountered an error. To view the message, click the alert notification icon to the right of the button that kicks off the task. A task in an error state will look something like this:

task error

Job History

The Credentialed Scan and Import Vulnerability Source Data tasks let you view historical information about jobs that have been run. This can be useful for determining when a given task succeeded or failed. To view the history for either of these tasks, click the "View History" button and you will see something like this:

job history

Rule Engine Data Feed

The rule engine data feed fetches the latest data on patches, known vulnerabilities, and "rules" (which allow DeepSurface to associate identified vulnerabilities with specific aspects of your architecture). This happens daily in the system and is necessary for DeepSurface to have the latest information available to you. Like any other task, it can be kicked off manually from this interface, but it can also be configured to run at a particular time every day.

To configure the specific time of day that this task runs, head to Setup > General Settings > Admin in the application menu and change the time entered in the TIME OF DAY FOR RULE FEED UPDATE field.

schedule feed
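
For reference, the daily schedule boils down to a simple next-run calculation from that configured time of day. The snippet below is a generic illustration of that logic, assuming an HH:MM value in local time; it is not DeepSurface's actual scheduler.

    from datetime import datetime, time, timedelta

    def next_rule_feed_run(time_of_day: str, now: datetime | None = None) -> datetime:
        """Illustration only: next daily run for a configured "HH:MM" local time."""
        now = now or datetime.now()
        hour, minute = (int(part) for part in time_of_day.split(":"))
        candidate = datetime.combine(now.date(), time(hour, minute))
        # If today's run time has already passed, the next run is tomorrow.
        if candidate <= now:
            candidate += timedelta(days=1)
        return candidate

    print(next_rule_feed_run("02:30"))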

Credentialed Scan

A credentialed scan collects information from each domain and host identified in the reconnaissance phase. This is the most intensive data collection scan performed by DeepSurface. When performing a credentialed scan, you can have it run for any number of scan groups. If no scan groups have been configured, this task is not yet available.

credentialed options

Aside from selecting which scan groups are to be included in the credentialed scan, you also have the option to tell DeepSurface to "Force re-scan of recently scanned hosts?". By default, this option is unchecked to avoid unnecessary scanning, but some customers find it useful to check it if they know something has changed that DeepSurface needs to be aware of.

Like the rule feed, this task can also be scheduled to run automatically. These scans are configured on a per-scan-group basis. You can have scheduled scans run at any interval you like (daily, weekly, or monthly) and can even have multiple scheduled scans for a single scan group. To schedule a scan for a scan group, head to Scanning > Credentialed > Scan Groups and edit any of your configured scan groups. While editing, click on the second tab of the edit modal and add as many schedules as you want. Once scheduled, the schedule will be visible in the scan group and in the card for this task.

schedule scan group 1

schedule scan group 2

schedule scan group 3
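
Conceptually, each scan group simply carries a list of schedules, each with its own interval. The structure below is a hypothetical illustration of that idea in Python; the field names and group names are invented and do not reflect DeepSurface's actual configuration format.

    # Hypothetical representation of per-scan-group schedules. Field names and group
    # names are invented for illustration, not DeepSurface's actual configuration.
    scan_group_schedules = {
        "Headquarters Servers": [
            {"interval": "daily", "time": "01:00"},
            {"interval": "weekly", "day": "Saturday", "time": "03:00"},
        ],
        "Branch Office Workstations": [
            {"interval": "monthly", "day_of_month": 1, "time": "02:00"},
        ],
    }

    # A single scan group can have any number of schedules at different intervals.
    for group, schedules in scan_group_schedules.items():
        print(group, "->", ", ".join(s["interval"] for s in schedules))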

Process Scan Queue

If you manually stopped a credentialed scan early, or want to manually sync any outstanding scan information from deployed agents, you can process that queue of hosts manually. No additional options are needed, other than optionally unchecking the option to run the next task in the sequence.

Import Vulnerability Source Data

The Import Vulnerability Source Data task imports data from one or more third-party vulnerability sources and merges this information into the DeepSurface threat model. The only additional information needed to kick off this task is which vulnerability sources you would like to import data from. As long as your vulnerability sources are configured correctly, DeepSurface will import the vulnerability data through each source's API using the credentials provided.
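
If you would rather kick this task off programmatically than from the interface, a request along the following lines is one way to picture it. The endpoint path, payload fields, and header here are hypothetical placeholders, not DeepSurface's documented API; consult the API documentation for the real interface.

    # Hypothetical sketch only: the URL, endpoint path, payload fields, and header
    # are placeholders, not DeepSurface's documented API.
    import requests

    CONSOLE_URL = "https://deepsurface.example.com"   # placeholder console address
    API_TOKEN = "..."                                 # placeholder credential

    response = requests.post(
        f"{CONSOLE_URL}/api/tasks/import-vulnerability-source-data",  # hypothetical path
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={
            # Which configured vulnerability sources to pull from (example names).
            "sources": ["Tenable.io API", "CrowdStrike Spotlight"],
            # Mirrors the "Run next task in sequence when finished?" checkbox.
            "run_next_in_sequence": True,
        },
        timeout=30,
    )
    response.raise_for_status()
    print(response.json())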

Risk Analysis and Prioritization

Finally, the Risk Analysis and Prioritization task is the deep offline analysis phase that finds all scenarios in which an attacker could leverage identified vulnerabilities in realistic attacks. This information is then used to prioritize risks and generate multiple reports and views; this analysis is what powers most of the DeepSurface interface. The only additional option is "Re-run analysis on models even if no underlying changes have occurred?". This option is unchecked by default, but can be checked if you want to force the system to run a full analysis on aspects that you know may have changed.