Opticka: Behavioural Experiment Manager


Opticka is an object-oriented framework with an optional GUI for the Psychophysics toolbox (PTB), allowing full experimental presentation of complex visual or other stimuli. It is designed to work on Linux, macOS or Windows. It interfaces via strobed words and/or ethernet with systems for recording neurophysiological and behavioural data. Full behavioural task control is available through a finite state machine controller, in addition to simple method of constants (MOC) experiments. Opticka uses the TCP interfaces of the Eyelink, Tobii Pro and iRecHS2 eyetrackers, affording better control, reliability and data recording than relying on analog voltage output (no DAQ card is needed for eye data). The base classes can be used without running the GUI (see optickatest.m for an example), and plug-n-play stimuli provide a unified interface (setup, animate, draw, update, reset) so they can be integrated into other PTB routines. Opticka's methods take care of all the background geometry and normalisation, meaning stimuli are much easier to use than "raw" PTB commands alone. Analysis routines are also included: eye data can be parsed into trials for any supported eyetracker, and e.g. Plexon files (.PL2 or .PLX), Eyelink files (.EDF) and behavioural responses can be parsed into a consistent structure that interfaces directly with Fieldtrip for further spike, LFP, and spike-LFP analysis. Opticka is more modular and affords much better stimulus control than e.g. MonkeyLogic (most stimuli are optimised OpenGL procedural textures with advanced control thanks to PTB).
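
As a rough sketch of that plug-n-play pattern (the class names screenManager and gratingStimulus come from the repo, but the specific constructor parameters and values shown here are assumptions; optickatest.m has a full working example):

s = screenManager('distance', 57.3, 'pixelsPerCm', 36); % screen geometry so stimuli use degree units (values are assumptions)
sv = open(s);                                  % open the PTB window
g = gratingStimulus('sf', 1, 'tf', 2, 'size', 4); % 1 c/deg, 2 Hz drift, 4 deg patch (assumed parameter names)
setup(g, s);                                   % bind the stimulus to the screen, precompute geometry
for i = 1:120                                  % ~2 seconds at 60 Hz
    draw(g);                                   % draw this frame
    animate(g);                                % advance drift / motion for the next frame
    flip(s);                                   % wraps Screen('Flip')
end
reset(g); close(s);                            % clean up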

Sample hardware setup

The diagram below shows a sample Opticka hardware configuration. The eyetracker, display, synchronisation and electrophysiology systems can all be swapped for other hardware (see the list below). While I prefer a Display++ or a DataPixx/ViewPixx to guarantee temporal fidelity, you can also use a LabJack or Arduino/XIAO for synchronisation (in which case a photodiode, which we support as an option, becomes more important):

Example hardware setup to run Opticka

GUI

A GUI can be used to control the hardware, stimuli and variables needed for both method of constants (MOC) and more complex behavioural tasks that use the state machine. The GUI supports protocol files, which are useful when the staff running an experiment are not themselves programmers, or when you need to change experiment or stimulus parameters quickly between runs: you make protocol files for each task, then load and run them as needed. The GUI is not required to utilise the underlying classes…

o = opticka; %==run the GUI, returns an object 'o' for introspection from the command window...

Opticka Screenshot

State machine control

For more complex behavioural tasks, a state machine is used. You can still edit visual stimuli and task variables in the GUI, and then edit a StateInfo.m file that specifies the states to be run (like paused, prefixation, stimulus, correct, breakfix etc.) and, for each state, which methods/functions are executed (cell arrays of functions that run on ENTER, WITHIN & EXIT of states). States can switch (TRANSITION) based on logic, for example a particular saccade, fixation or button response can trigger a transition to a correct state… See an example StateInfo.m file here; you can test a minimal generic state machine run using:

sM = stateMachine;
runDemo(sM);
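
As a rough sketch of how a StateInfo.m file lays states out (the column order and the addStates call are assumptions based on the CoreProtocols examples; check a real StateInfo.m for the authoritative format), each row names a state, its default next state, a timeout, and the function cell arrays run at each phase:

sM = stateMachine;
% assumed column layout: name | next | time (s) | ENTER fcns | WITHIN fcns | TRANSITION fcns | EXIT fcns
stateInfoTmp = {
    'prefix'    'stimulus'  1    {@()fprintf('enter prefix\n')}    {}  {}  {};
    'stimulus'  'breakfix'  2    {@()fprintf('show stimulus\n')}   {}  {}  {};
    'correct'   'prefix'    0.5  {@()fprintf('correct!\n')}        {}  {}  {};
    'breakfix'  'prefix'    0.5  {@()fprintf('broke fixation\n')}  {}  {}  {};
};
addStates(sM, stateInfoTmp); % assumed method name; loading a protocol via the GUI does this for you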

Hardware currently supported

  • Display & Digital I/O: high quality display (high bit depths, great colour management) and microsecond precise frame-locked digital I/O: Display++ by CRS.
  • Display & Digital I/O: high quality display (high bit depths) and easy-to-use microsecond precise digital I/O: DataPixx / ViewPixx / ProPixx.
  • Display: any normal monitor; remember that PTB can support 10 bits and higher output, stereoscopic display, HDR output etc.
  • Digital I/O: LabJack USB U3/U6 or T4/T7 DAQs, strobed words up to 12 bits. The T4/T7 are preferred as their I/O is asynchronous and they work on all platforms.
  • Digital I/O: Arduino boards for simple TTL triggers for reward systems, MagStim etc. In particular, digital TTLs are asynchronous so they do not block the experimental loop (see the reward-trigger sketch after this list). The Seeeduino XIAO is small, cheap, fast and works well, as does the Raspberry Pi Pico (using the Arduino IDE interface), but the Uno is also well supported.
  • TouchScreen: PTB can interface with many kinds of touchscreen, and we have tested against many different brands. Our touchManager class (touchManager.m) adds support for touch windows, exclusion zones and more…
  • Eyetracking: Eyelink Eyetrackers -- uses the native ethernet link API. This enables much better two-way control, sending markers and stimulus data while drawing stimuli and experiment values onto the eyelink screen. EDF files are stored after each run and eyelinkAnalysis.m class uses native EDF loading (not ascii proxies) for full trial-by-trial analysis without conversion. See also eyelinkManager.
  • Eyetracking: Tobii Pro Eyetrackers -- using the excellent Titta toolbox to manage calibration, command-response and recording. Tobii Pro eyetrackers do not require head fixation. See tobiiManager.
  • Eyetracking: iRecHS2 -- this low-cost eyetracker nevertheless offers good quality eyetracking, see their paper: A Widely Applicable Real-Time Mono/Binocular Eye Tracking System Using a High Frame-Rate Digital Camera (2017). We use a 500Hz Chameleon camera from FLIR. See iRecManager.
  • Electrophysiology: in theory any recording system that accepts digital triggers / strobed words; we have dedicated code for the Plexon Omniplex system and can control Intan amplifier software. Opticka can use TCP communication over ethernet to transmit current variable data to allow online data visualisation (PSTHs etc. for each experiment variable) on the Omniplex machine. Digital triggers can be generated with good temporal fidelity.
  • Visual Calibration: we support a CRS SpectroCal II (preferred but expensive) or ColorCal 2, a VPixx i1Pro, or manual interfacing with most other photometers that PTB supports. See the calibrateLuminance class.
  • Photodiode boxes: we use TSL251R light-to-voltage photodiodes, which can be recorded directly into your electrophysiology system or can generate digital triggers via an Arduino interface.
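
As an illustration of how the Arduino digital I/O is typically used for a reward TTL (the arduinoManager class exists in the repo, but the constructor arguments and the timedTTL signature shown here are assumptions; check the class help before relying on them):

rw = arduinoManager('port', '/dev/ttyACM0'); % serial port is machine-specific (assumption)
open(rw);                 % open the serial connection to the board
timedTTL(rw, 2, 300);     % assumed signature: raise pin 2 for 300 ms to drive a reward pump
close(rw);                % release the serial port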

Quick Documentation

optickatest.m is a minimal example showing a simple script-based method of constants (MOC) experiment with 11 different animated stimuli varying across angle, contrast and orientation. Read the Matlab-generated documentation here: optickatest.m Report. More complex behavioural control (gaze-contingent experiments with variable logic per trial) utilises a state machine, see optickaBehaviourTest.m Report. You can see examples of stateMachine control files in the CoreProtocols folder.

There is auto-generated class documentation here: Opticka Class Docs, that details the major classes and their methods and properties. This is generated from the comments in the code, which as always could be improved…

Basic Install Instructions

Opticka prefers the latest Psychophysics Toolbox (V3.0.17+) and at least MATLAB 2017a (it uses object-oriented property validation introduced in that version). It has been tested on 64-bit Ubuntu 20.04 & macOS 12.x with MATLAB 2021b (newer versions are generally faster). You can simply download the GitHub ZIP file, unzip it, CD to the folder and run addOptickaToPath.m. Or, to keep easily up to date if you have git installed, clone this GitHub repo, CD to the folder, then run addOptickaToPath.m.
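
For example, from the MATLAB command window (the clone path below is purely illustrative):

cd('~/Code/opticka')   % hypothetical location — wherever you unzipped or cloned the repo
addOptickaToPath       % adds the opticka folders to the MATLAB path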

Opticka currently works on Linux, macOS and Windows. The older LabJack U3/U6 interface currently only works under Linux and macOS; the LabJack T4/T7 does work cross-platform however. Linux is by far the best OS according to the PTB developer Mario Kleiner, and receives the majority of his development work, so it is strongly advised for experiments. My experience is that Linux is much more robust and performant than macOS or Windows, and it is well worth the effort to use Linux for PTB experimental computers.

See Detailed instructions for full install details…

Features

  • Values are always specified in eye-relevant co-ordinates (degrees etc.) that are internally calculated based on screen geometry/distance.
  • No limit on the number of independent variables, and variables can be linked to multiple stimuli.
  • State machine logic can run behavioural tasks driven by e.g. eye position or behavioural response. State machines run tasks flexibly, with chains of states defining your experimental loop.
  • The number of heterogeneous stimuli displayed simultaneously is limited only by GPU / computer power. Use of GPU procedural textures wherever possible ensures fast and efficient stimuli.
  • Display lists are used, so one can easily change drawing order (i.e. what stimulus draws over other stimuli), by changing its order in the list.
  • Object-oriented, allowing stimulus classes to be easily added and code relationships to be auto-documented using Doxygen.
  • The set of stimuli and variables can be saved into protocol files, to easily run successive protocols quickly.
  • Fairly comprehensive control of the PTB interface to the drawing hardware, like blending mode, bit depth, windowing, verbosity.
  • Colour is defined in floating-point format and takes advantage of higher bit depths in newer graphics cards when available. The buffer can be defined from 8–32 bits, full alpha blending is available within that space, and >8-bit output can be enabled using pseudogrey bitstealing techniques. Supports both Display++ and VPixx display modes.
  • Sub-pixel precision (1/256th pixel) for movement and positioning.
  • TTL output to data acquisition and other devices. Currently uses DataPixx, Display++ or LabJack to interface to the Electrophysiology systems using strobed words.
  • Can communicate with other machines on the network during display using TCP/UDP (used e.g. to control a Plexon online display, so one can see PSTHs for each stimulus variable shown in real time).
  • Each stimulus has its own relative X & Y position, and the screen centre can be arbitrarily moved via the GUI. This allows quick setup over particular parts of visual space, i.e. relative to a receptive field without needing to edit lots of other values.
  • Can record stimuli to video files.
  • Manages monitor calibration using a SpectroCal II or ColorCal II from CRS, or an i1Pro from VPixx. Calibration sets can be loaded, saved and plotted locally via the GUI.
  • Gratings (all using procedural textures for high performance):
    • Per-frame update of properties for arbitrary numbers of grating patches.
    • Rectangular or circular aperture.
    • Cosine or hermite interpolation for filtering grating edges.
    • Square wave gratings, also using a procedural texture, i.e. very fast.
    • Gabors
  • Colour gratings; any two colours procedurally blended against a background colour.
  • Coherent dot stimuli; coherence expressed from 0–1. Either square or round dots. Colours can be simple, random, random luminance or binary. Kill rates allow random replacement of dots. A circularly smoothed masked aperture is available as an option. Newsroom style dots with motion distributions etc.
  • Bars; either solid colour or checkerboard / random noise texture. Bars can be animated, direction can be independent of their angle.
  • Flashing smoothed spots.
  • Pictures/Images that can drift and rotate.
  • Movies that can be scaled and drift. Movie playback is double-buffered to allow them to work alongside other stimuli.
  • Hand-mapping module - use a mouse-controlled dynamic bar / texture / colour stimulus to hand-map receptive fields; includes logging of clicked positions and later printout / storage of hand maps. These maps are in screen co-ordinates for quick subsequent stimulus placement.

Eye tracker control

Opticka supports the Eyelink, Tobii Pro and iRecHS2 eyetrackers through a unified API: the same commands manage any of the eyetrackers, so you don't need to change your code. The API offers a lot of useful features for behavioural control:

  • Fixation windows — one or more screen positions can become targets that the eye must enter and then remain within for a period of time. Windows can be circular or rectangular, and the methods can be 'strict' or not (defining whether quick exits and re-entries are punished, which is important for training etc.). A single method controls the whole procedure; for example value = testSearchHoldFixation('correct','fail') runs a timer allowing the subject to search for the target (timer = initTime), then enter and maintain fixation (timer = time). If the subject succeeds, the first string 'correct' is returned; if they break fixation or fail to find the target, the second string 'fail' is returned. Your code can then simply check this string (see the sketch after this list). There are several variants of these methods depending on your requirements.
  • Exclusion windows — one or more screen positions can be used to "break" the trial if the eye enters them. For example, if a subject saccades to the wrong side of the screen, the exclusion is triggered.
  • Initiation timers — some subjects try to cheat by guessing a saccadic target, and will therefore initiate a fixation sooner than could be expected. The initiation timer defines a period (100 ms by default) during which the subject cannot leave the current window position on the screen.
  • Calibration — We enable maximum control over calibration for both eyelink (using a custom callback to enable rewards and allowing manual calibration mode for example) and Tobii (using the excellent Titta toolbox). Calibrations can be performed during a task run.
  • Data markers — Eyelink EDF recording uses a specific set of recommended message markers to define trials and other data. We use the same markers for both Eyelink and Tobii data, simplifying subsequent analysis.
  • Drift correction — a manual single-point drift correction mode is offered for both eye trackers, callable during any task.
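
As a rough sketch of how this unified API is typically used inside a task (the manager classes and testSearchHoldFixation are named above, but the constructor arguments, trackerSetup and the updateFixationValues parameter order shown here are assumptions; see eyelinkManager / tobiiManager / iRecManager for the real signatures):

eT = eyelinkManager();              % or tobiiManager() / iRecManager(); downstream code is identical
initialise(eT, s);                  % s = an already-open screenManager (assumption)
trackerSetup(eT);                   % run calibration (assumed method name)
updateFixationValues(eT, 0, 0, 1, 0.75, 2, true); % X, Y, initTime, fixTime, radius, strict (assumed order)
startRecording(eT);
% called every frame inside a state; returns '' while undecided (assumption),
% then one of the two strings you passed in
result = testSearchHoldFixation(eT, 'correct', 'fail');
if strcmpi(result, 'correct')
    fprintf('subject found and held the fixation target\n');
end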

Licence

Opticka is licensed under the LGPL3 open source licence.
