alf.io

Generic ALF I/O module. Provides support for time-series reading and interpolation as per the specifications. For a full overview of the scope of the format, see: https://ibllib.readthedocs.io/en/develop/04_reference.html#alf

alf.io.check_dimensions(dico)

Test for consistency of dimensions in a dictionary, as per the ALF specs. Raises a ValueError.

ALF broadcasting rules: only accepts consistent dimensions for a given axis. A dimension is consistent with another if it is empty, 1, or equal to the other array's dimension. [a, 1], [1, b] and [a, b] are all consistent; [c, 1] is not.

Parameters:
  • dico – dictionary containing data
Returns:

status 0 for consistent dimensions, 1 for inconsistent dimensions
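The consistency rule above can be sketched on plain shape tuples, checking the first axis only (a simplified illustration of the stated rule, not the library's implementation; `dims_consistent` is a hypothetical name):

```python
def dims_consistent(shapes):
    """Return 0 if the shapes are consistent along the first axis,
    1 otherwise. Sketch of the ALF rule: a first dimension is
    consistent if it is empty, 1, or equal to the others."""
    # Ignore empty shapes and singleton first dimensions
    first_dims = {s[0] for s in shapes if s and s[0] != 1}
    return 0 if len(first_dims) <= 1 else 1
```

For example, `dims_consistent([(12, 1), (1, 5), (12, 5)])` returns 0, while adding a (7, 1) array makes it return 1, mirroring the [c, 1] case above.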
alf.io.exists(alfpath, object, attributes=None, glob='.*')

Test if an ALF object, and optionally specific attributes, exist in the given path.

Parameters:
  • alfpath – str or pathlib.Path of the folder to look into
  • object – str, ALF object name
  • attributes – str or list of strings of wanted attributes
  • glob – (".*") glob pattern to look for files, or list of parts as per ALF specifications
Returns:

Bool. For multiple attributes, returns True only if all attributes are found

alf.io.get_session_path(path_object)

From a full file path or folder path, gets the session root path.

Parameters:
  • path_object – pathlib.Path or string
Returns:

the session root path
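Assuming the canonical ALF session layout .../subject/YYYY-MM-DD/number/..., the lookup could be sketched as follows (`session_path` and the layout assumption are illustrative, not the library's implementation):

```python
import re
from pathlib import Path

DATE = re.compile(r'^\d{4}-\d{2}-\d{2}$')

def session_path(path):
    """Sketch: return the path up to the session number folder
    (.../subject/YYYY-MM-DD/number), or None if no date folder."""
    parts = Path(path).parts
    for i, p in enumerate(parts):
        # The session root ends one level below the date folder
        if DATE.match(p) and i + 1 < len(parts):
            return Path(*parts[:i + 2])
    return None
```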

alf.io.is_uuid_string(string)

Bool test to check whether a string is a valid UUID.
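A minimal sketch of such a test using the standard uuid module (not the library's implementation):

```python
import uuid

def is_uuid_string_sketch(string):
    """Sketch: True if `string` is a canonical hyphenated UUID."""
    try:
        # Round-trip through uuid.UUID to reject non-canonical forms
        return str(uuid.UUID(string)) == string.lower()
    except (ValueError, TypeError):
        return False
```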

alf.io.load_file_content(fil)

Returns content of files. Designed for very generic file formats: so far supported contents are json, npy, csv, tsv, ssv, jsonable

Parameters:
  • fil – file to read
Returns:

array / json / pandas DataFrame, depending on the format

alf.io.load_object(alfpath, object=None, glob='.*', short_keys=False)

Reads all files (i.e. attributes) sharing the same object. For example, if the file provided to the function is spikes.times, the function will load spikes.times, spikes.clusters, spikes.depths, spikes.amps in a dictionary whose keys will be times, clusters, depths, amps. Full reference here: https://github.com/cortex-lab/ALF

Simplified example: _namespace_object.attribute.part1.part2.extension

Parameters:
  • alfpath – any alf file pertaining to the object OR directory containing files
  • object – if a directory is provided, need to specify the name of object to load
  • glob – a file filter string as used in glob, e.g. ".amps."
  • short_keys – by default, the output dictionary keys are compounds of the attribute and any extra parts, separated by a dot. Use True to shorten the keys to the bare attribute.
Returns:

a dictionary of all attributes pertaining to the object

example: spikes = alf.io.load_object('/path/to/my/alffolder/', 'spikes')
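The key-building convention described above (attribute plus any extra parts, or the bare attribute with short_keys=True) can be sketched as follows (`attribute_key` is a hypothetical helper, not the library's implementation):

```python
from pathlib import Path

def attribute_key(filename, short_keys=False):
    """Sketch: derive the dictionary key from an ALF file name of
    the form [_namespace_]object.attribute[.part...].extension."""
    parts = Path(filename).name.split('.')
    # Drop the object (first token) and the extension (last token)
    attr_and_parts = parts[1:-1]
    return attr_and_parts[0] if short_keys else '.'.join(attr_and_parts)
```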

alf.io.read_ts(filename)

Load time-series from ALF format.

t, d = alf.read_ts(filename)

alf.io.remove_uuid_file(file_path, dry=False)

Renames a file to remove the UUID from its name and returns the new pathlib.Path object.
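Assuming the UUID appears as a dotted token in the file name (e.g. spikes.times.&lt;uuid&gt;.npy), the name-stripping step can be sketched as follows (`stripped_name` is hypothetical and omits the actual rename):

```python
import re
from pathlib import Path

UUID_RE = re.compile(
    r'^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$',
    re.IGNORECASE)

def stripped_name(file_path):
    """Sketch: drop any dotted token that looks like a UUID from
    the file name, returning the new Path (no rename performed)."""
    p = Path(file_path)
    kept = [t for t in p.name.split('.') if not UUID_RE.match(t)]
    return p.with_name('.'.join(kept))
```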

alf.io.remove_uuid_recursive(folder, dry=False)

Within a folder, recursively renames all files to remove UUIDs.

alf.io.save_metadata(file_alf, dico)

Writes a metadata file matching a current ALF file object. For example, given an ALF file clusters.ccf_location.ssv, this will write a dictionary in JSON format to clusters.ccf_location.metadata.json. Reserved keywords:

  • columns: column names for binary tables.
  • row: row names for binary tables.
  • unit
Parameters:
  • file_alf – full path to the alf object
  • dico – dictionary containing metadata.
Returns:

None

alf.io.save_object_npy(alfpath, dico, object, parts='')

Saves a dictionary in ALF format using object as the object name and dictionary keys as attribute names. Dimensions have to be consistent. Reference here: https://github.com/cortex-lab/ALF

Simplified example: _namespace_object.attribute.part1.part2.extension

Parameters:
  • alfpath – path of the folder to save data to
  • dico – dictionary to save to npy
  • object – name of the object to save
  • parts – extra parts to the ALF name
Returns:

List of written files

example: alf.io.save_object_npy('/path/to/my/alffolder/', spikes, 'spikes')
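The file-name construction implied by the simplified example above can be sketched as follows (`alf_filename` is a hypothetical helper, not the library function):

```python
def alf_filename(obj, attribute, parts='', ext='npy'):
    """Sketch: build an ALF file name of the form
    object.attribute[.parts].ext from its components."""
    tokens = [obj, attribute] + ([parts] if parts else []) + [ext]
    return '.'.join(tokens)
```

For example, saving a dictionary key 'times' under object 'spikes' with parts='raw' would yield the file name spikes.times.raw.npy.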