Input-output manifest

Examples

Noise exposure

Basic configuration of a system with one output (connected to a speaker) and two inputs (from microphones) driven by a National Instruments DAQ card. This configuration is about as simple as it gets.

from enaml.workbench.api import PluginManifest, Extension

from psi.controller.engines.nidaq import (NIDAQEngine, 
                                          NIDAQHardwareAIChannel,
                                          NIDAQHardwareAOChannel,
                                          NIDAQSoftwareDOChannel)


enamldef IOManifest(PluginManifest): manifest:
    '''
    Example of a simple configuration for noise exposure
    '''
    Extension:
        id = 'backend'
        point = 'psi.controller.io'

        NIDAQEngine:
            # Each engine must have a unique name.
            name = 'NI'

            master_clock = True

            hw_ai_monitor_period = 0.1
            hw_ao_monitor_period = 1

            NIDAQHardwareAOChannel:
                # Label as shown in the GUI
                label = 'Noise exposure speaker'

                # Label as used in the code
                name = 'speaker'

                # Sampling rate the channel runs at. The engine may impose some
                # constraints on this sampling rate. For example, all analog
                # output channels configured on a particular NIDAQmx engine
                # must run at the same sampling rate.
                fs = 25e3

                # The data type required by the channel.
                dtype = 'float64'

                # This is a NIDAQmx-specific feature and is the channel
                # identifier used by the NIDAQmx library. Channels that
                # interface with other types of hardware will have their own
                # method for identifying channels.
                channel = 'Dev1/ao1'

                # Also a NIDAQmx-specific feature. This allows the NIDAQmx
                # library to optimize the channel configuration based on the
                # expected output range. 
                expected_range = (-10, 10)

            NIDAQHardwareAIChannel:
                label = 'Experiment microphone'
                name = 'experiment_microphone'
                channel = 'Dev1/ai2'
                start_trigger = 'ao/StartTrigger'
                fs = 100e3
                expected_range = (-10, 10)
                dtype = 'float64'
                terminal_mode = 'differential'
                gain = 20

            NIDAQHardwareAIChannel:
                label = 'Calibration microphone'
                name = 'calibration_microphone'
                channel = 'Dev1/ai1'
                start_trigger = 'ao/StartTrigger'
                fs = 100e3
                expected_range = (-10, 10)
                dtype = 'float64'
                terminal_mode = 'differential'
                gain = 0

Appetitive go-nogo behavior

Example configuration of a system designed for appetitive go-nogo behavior, where the subject must nose-poke to start a trial and retrieve its reward from a food hopper. Both the nose-poke and the food hopper have an infrared beam (photoemitter to photosensor) that generates an analog signal indicating the intensity of light falling on the photosensor. If the path between the photoemitter and photosensor is blocked (e.g., by the subject's nose), the analog readout reflects the change in light intensity. The analog readouts of the photosensors are connected to the nose_poke and reward_contact channels.

For a go-nogo behavioral task, we need to convert this analog readout to a binary signal indicating whether or not the subject broke the infrared beam. In the following example we define a reusable channel type, IRChannel, whose processing chain (an IIRFilter followed by Downsample, Threshold, and Edges inputs) performs this conversion, and we apply it to both the nose-poke and food hopper inputs.

from enaml.workbench.api import Extension, PluginManifest

from psi.controller.engines.nidaq import (NIDAQEngine,
                                          NIDAQHardwareAIChannel,
                                          NIDAQHardwareAOChannel,
                                          NIDAQSoftwareDOChannel)

from psi.controller.api import (CalibratedInput, Downsample, Edges, IIRFilter,
                                Threshold, Trigger, Toggle)


enamldef IRChannel(NIDAQHardwareAIChannel): irc:
    unit = 'V'
    start_trigger = 'ao/StartTrigger'
    fs = 100e3
    expected_range = (-10, 10)
    dtype = 'float64'
    terminal_mode = 'differential'

    IIRFilter: iir:
        name << irc.name + '_filtered'
        f_lowpass = 25
        ftype = 'butter'
        btype = 'lowpass'

        Downsample: ds:
            name << irc.name + '_analog'
            q = 1000
            Threshold: th:
                threshold = 2.5
                Edges: e:
                    name << irc.name + '_digital'
                    debounce = 2




enamldef IOManifest(PluginManifest): manifest:
    '''
    This defines the hardware connections that are specific to the LBHB Bobcat
    computer for the appetitive experiment.
    '''

    Extension:
        id = 'backend'
        point = 'psi.controller.io'

        NIDAQEngine: engine:
            name = 'NI'
            master_clock = True

            # Since we're using an AnalogThreshold input to detect nose pokes
            # and reward contact, we want a fairly short AI monitor period to
            # ensure that we detect these events quickly.
            hw_ai_monitor_period = 0.025
            hw_ao_monitor_period = 1

            NIDAQHardwareAOChannel:
                label = 'Speaker'
                name = 'speaker'
                channel = 'Dev1/ao0'
                fs = 100e3
                expected_range = (-10, 10)
                dtype = 'float64'
                terminal_mode = 'RSE'
                calibration_user_editable = True

            NIDAQSoftwareDOChannel:
                name = 'food_dispense'
                channel = 'Dev1/port0/line0'

                Trigger:
                    # This is a required output for the food dispenser. The
                    # plugin will look for this output by name. If not present,
                    # the food dispenser plugin will not work!
                    label = 'Food dispense'
                    name = 'food_dispense_trigger'
                    duration = 0.1

            NIDAQSoftwareDOChannel:
                name = 'room_light'
                channel = 'Dev1/port0/line1'

                Toggle:
                    # This is a required output for the room light. The plugin
                    # will look for this output by name. If not present, the
                    # room light plugin will not work!
                    name = 'room_light_toggle'
                    label = 'Room light'

            IRChannel:
                label = 'Nose poke IR'
                name = 'nose_poke'
                channel = 'Dev1/ai0'

            IRChannel:
                label = 'Reward IR'
                name = 'reward_contact'
                channel = 'Dev1/ai1'

In the example code above, you’ll note that we defined a Toggle output named room_light_toggle. If you look at the experiment manifest for appetitive experiments, you’ll see that we’ve defined two actions that control this output:

ExperimentAction:
    event = 'to_start'
    command = 'room_light_toggle.off'

ExperimentAction:
    event = 'to_end'
    command = 'room_light_toggle.on'

The to_start and to_end events are generated by the appetitive controller when a timeout begins and ends. The rules above turn off the room light when a timeout begins and turn it back on when the timeout ends. This rules-based action system simplifies the process of ensuring that the correct sequence of actions occurs whenever a given event takes place.
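
Because actions are just rules bound to events, the same output can be driven from any number of events. The sketch below is illustrative only; the event name 'experiment_end' is an assumption, and it simply reuses the room_light_toggle command shown above to make sure the light is restored even if the experiment ends during a timeout.

ExperimentAction:
    # Hypothetical event name ('experiment_end' is assumed here). Restores
    # the room light in case the experiment ends in the middle of a timeout.
    event = 'experiment_end'
    command = 'room_light_toggle.on'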

In theory, if an Arduino backend were implemented, we could configure the room light to be controlled by an Arduino instead:

ArduinoEngine:
    DigitalOutput:
        name = 'room_light'
        channel = 'Pin1'

The engine is responsible for:

  • Configuring the input and output channels.

  • Responding to requests (e.g., uploading waveforms to analog output channels and toggling the state of digital output channels).

  • Continuously polling input channels and passing the acquired data through the input processing pipeline. All data acquisition is continuous; if you want epoch-based acquisition, add a special input, ExtractEpochs, to your input hierarchy (see the sketch below).
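
A minimal sketch of epoch-based acquisition, assuming ExtractEpochs can be imported from psi.controller.api and accepts an epoch_size attribute (the attribute values and the idea of attaching it directly to the microphone channel are assumptions, not a tested configuration):

from psi.controller.api import ExtractEpochs

NIDAQHardwareAIChannel:
    label = 'Experiment microphone'
    name = 'experiment_microphone'
    channel = 'Dev1/ai2'
    fs = 100e3
    dtype = 'float64'

    ExtractEpochs:
        # Hypothetical configuration: slices fixed-duration segments out of
        # the continuous microphone stream. epoch_size is assumed to be in
        # seconds.
        name = 'microphone_epochs'
        epoch_size = 0.1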

You can have multiple engines in a single experiment.
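
For example, channels could be split across two engines, as long as each engine has a unique name. The sketch below is hypothetical; it pairs a NIDAQEngine with the imagined ArduinoEngine from the earlier example, and the device names and attribute values are assumptions rather than a tested configuration.

Extension:
    id = 'backend'
    point = 'psi.controller.io'

    NIDAQEngine:
        # Handles the audio output. Presumably only one engine should act as
        # the master clock.
        name = 'NI_audio'
        master_clock = True

        NIDAQHardwareAOChannel:
            name = 'speaker'
            channel = 'Dev1/ao0'
            fs = 100e3
            dtype = 'float64'
            expected_range = (-10, 10)

    ArduinoEngine:
        # Hypothetical second engine (see the Arduino example above) that
        # handles the room light.
        name = 'arduino'

        DigitalOutput:
            name = 'room_light'
            channel = 'Pin1'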