Plugin Scripts
The Sal server can embed a script in your plugin that is sent to clients on checkin and run during postflight (as of Sal 2.5.0). The information sent back to the server can either be refreshed every time the client checks in, or it can be stored up to the retention limit (historical data).
Simply create a directory called scripts inside your plugin and put files with a proper shebang in there. Anything in this directory will be sent to the client when the plugin is activated, on the next checkin.
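As a minimal sketch, the layout can be set up like this. The plugin path and script name below are hypothetical stand-ins; use your Sal server's real plugins directory.

```shell
# Hypothetical plugin location (a temp dir here as a stand-in).
PLUGIN_DIR="$(mktemp -d)/MunkiInfo"
mkdir -p "$PLUGIN_DIR/scripts"

# Any file in scripts/ with a proper shebang is shipped to clients on checkin.
cat > "$PLUGIN_DIR/scripts/gather_info.py" <<'EOF'
#!/usr/bin/env python3
# ...collect data and write it to /usr/local/sal/plugin_results.plist...
EOF
chmod +x "$PLUGIN_DIR/scripts/gather_info.py"
```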
The script should write your information locally to /usr/local/sal/plugin_results.plist
. The plist should look like the example below. It is the plugin author's responsibility to ensure their script preserves the correct format and appends to any other plugins' data rather than overwriting it. The plugin's name should be set as a string value associated with the plugin
key; an optional historical
boolean can be enabled (or left out, in which case it defaults to False); and data should be a dictionary containing the keys and values to be stored, all of whose values should be strings. Any value that is not a string will be cast to a string by the server if possible, or else discarded/ignored.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<array>
<dict>
<key>plugin</key>
<string>MunkiInfo</string>
<key>historical</key>
<false/>
<key>data</key>
<dict>
<key>SoftwareRepoURL</key>
<string>http://munki</string>
<key>ClientIdentifier</key>
<string>some_identifier</string>
...
</dict>
</dict>
</array>
</plist>
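A client-side script that produces the plist above can be sketched as follows, assuming Python 3 with the standard-library plistlib is available on the client. The plugin name and data keys match the MunkiInfo example; the function name and the hard-coded values are illustrative only.

```python
import os
import plistlib

RESULTS_PATH = '/usr/local/sal/plugin_results.plist'


def write_results(path=RESULTS_PATH):
    # Read any results other plugins have already written, so we append
    # to them rather than clobbering the file.
    results = []
    if os.path.exists(path):
        with open(path, 'rb') as handle:
            results = plistlib.load(handle)

    # Replace any stale entry from a previous run of this plugin.
    results = [r for r in results if r.get('plugin') != 'MunkiInfo']
    results.append({
        'plugin': 'MunkiInfo',
        'historical': False,
        # All values must be strings, or the server will cast/discard them.
        'data': {
            'SoftwareRepoURL': 'http://munki',
            'ClientIdentifier': 'some_identifier',
        },
    })

    with open(path, 'wb') as handle:
        plistlib.dump(results, handle)

# On a real client, the script would simply call write_results() at the end.
```

Filtering out this plugin's previous entry before appending keeps the file from growing on every run while still preserving other plugins' data.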
As previously mentioned, there are two types of data storage. The default assumes you want only the most recent snapshot: once new data is safely stored, anything older is discarded. By marking your data as historical, it will instead be kept in the database until the retention limit configured in Settings
is reached (180 days by default).
Plugins are passed a queryset of Machines
in the arguments to the widget_content
method. A working knowledge of Django's queryset API is needed to fully understand what is happening; however, the following examples demonstrate how to make use of a plugin's results.
The Sal MunkiInfo plugin demonstrates how to drill down through the related fields of the Machine model to count the different possible values, as well as retrieving string values:
from django.db.models import Count, F


class MunkiInfo(IPlugin):
    # ...
    def widget_content(self, page, machines=None, theid=None):
        # HTTP-only machines
        http_only = machines.filter(
            pluginscriptsubmission__plugin='MunkiInfo',
            pluginscriptsubmission__pluginscriptrow__pluginscript_name='SoftwareRepoURL',
            pluginscriptsubmission__pluginscriptrow__pluginscript_data__startswith='http://').count()
        # HTTPS-only machines
        https_only = machines.filter(
            pluginscriptsubmission__plugin='MunkiInfo',
            pluginscriptsubmission__pluginscriptrow__pluginscript_name='SoftwareRepoURL',
            pluginscriptsubmission__pluginscriptrow__pluginscript_data__startswith='https://').count()
        # Distinct repo URLs, with a count of machines using each
        repo_urls = machines.filter(
            pluginscriptsubmission__plugin='MunkiInfo',
            pluginscriptsubmission__pluginscriptrow__pluginscript_name='SoftwareRepoURL'
        ).annotate(
            pluginscript_data=F('pluginscriptsubmission__pluginscriptrow__pluginscript_data')
        ).values(
            'pluginscript_data'
        ).annotate(
            count=Count('pluginscript_data')
        ).order_by('pluginscript_data')
        # ... further queries follow
Starting in Sal 3.0.0, when your data is submitted, Sal attempts to save it in several native fields so that you can perform more efficient queries. These fields are:
pluginscript_data_string = models.TextField(blank=True, null=True, db_index=True)
pluginscript_data_int = models.IntegerField(default=0)
pluginscript_data_date = models.DateTimeField(blank=True, null=True)
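The exact parsing rules live in Sal's server code, but the idea can be sketched in plain Python: try progressively stricter interpretations of the submitted string, and fall back to storing it as text. The function name and the timestamp format below are assumptions for illustration, not Sal's actual implementation.

```python
from datetime import datetime


def choose_native_field(value):
    """Sketch of dispatching a submitted value to a typed column.

    Returns a (field_name, parsed_value) pair. Illustrative only;
    Sal's real parsing rules may differ.
    """
    text = str(value)
    try:
        return 'pluginscript_data_int', int(text)
    except ValueError:
        pass
    try:
        # Hypothetical timestamp format for this sketch.
        return 'pluginscript_data_date', datetime.strptime(text, '%Y-%m-%d %H:%M:%S')
    except ValueError:
        pass
    return 'pluginscript_data_string', text
```

With values landed in typed columns, plugins can filter with numeric or date lookups (for example, an integer comparison on pluginscript_data_int) instead of comparing strings.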