
Commit 9b55b11

Committed by bpinsard
Merge branch 'master' into fsl_b0
* master: (188 commits)
  fixed paths
  Back to dev
  0.7 release
  Code cleanup.
  doc: updated inputs help to indicate FSL and AFNI inaccuracy when using bounds_by_brainmask
  Fix nipy#493
  Fixes nipy#459
  fixed doctests
  changelog
  fix: ants workflows setup
  BF: Fixed several issues with GroupAndStack interface
  DOC: Various clean up an improvements to docstrings.
  DOC: Added example to LookupMeta docstring
  ENH: Allow meta_keys input to LookupMeta be a mapping.
  minor fixes
  deprecated version should be a string, added AFNITraitedSpec for backward compatibility
  docs
  PEP8
  name_source can be a list, make "infolder" a deprecated name
  Removed unnecessary outputsspecs
  ...
2 parents 464a656 + b36cc9c commit 9b55b11

108 files changed, +11913 -3369 lines changed


.travis.yml

Lines changed: 21 additions & 22 deletions
@@ -1,22 +1,21 @@
-# vim ft=yaml
-# travis-ci.org definition for nipy build
-#
-# We pretend to be erlang because we need can't use the python support in
-# travis-ci; it uses virtualenvs, they do not have numpy, scipy, matplotlib,
-# and it is impractical to build them
-language: erlang
-env:
-  - PYTHON=python PYSUF=''
-  # - PYTHON=python3 PYSUF=3 : python3-numpy not currently available
-install:
-  - sudo apt-get install $PYTHON-dev
-  - sudo apt-get install $PYTHON-numpy
-  - sudo apt-get install $PYTHON-scipy
-  - sudo apt-get install $PYTHON-networkx
-  - sudo apt-get install $PYTHON-traits
-  - sudo apt-get install $PYTHON-setuptools
-  - sudo easy_install$PYSUF nibabel # Latest pypi
-  - sudo apt-get install $PYTHON-nose
-script:
-  # Change into an innocuous directory and find tests from installation
-  - make test
+language: python
+python:
+  - "2.7"
+before_install:
+  - deactivate
+  - sudo apt-get update -qq
+  - sudo apt-get install lsb-release
+  - source /etc/lsb-release
+  - wget -O- http://neuro.debian.net/lists/${DISTRIB_CODENAME}.us-nh | sudo tee /etc/apt/sources.list.d/neurodebian.sources.list
+  - sudo apt-key adv --recv-keys --keyserver pgp.mit.edu 2649A5A9
+  - sudo apt-get update -qq
+  - sudo apt-get install -qq python-scipy python-nose
+  - sudo apt-get install -qq python-networkx python-traits python-setuptools
+  - sudo apt-get install -qq python-nibabel
+  - sudo apt-get install -qq --no-install-recommends fsl afni
+  - sudo apt-get install -qq fsl-atlases
+  - source /etc/fsl/fsl.sh
+  - virtualenv --system-site-packages ~/virtualenv/this
+  - source ~/virtualenv/this/bin/activate
+install: python setup.py build_ext --inplace
+script: make test

CHANGES

Lines changed: 10 additions & 1 deletion
@@ -1,11 +1,20 @@
 Next release
 ============
 
-* ENH: New interfaces: ICC, Meshfix, ants.Register
+
+Release 0.7.0 (Dec 18, 2012)
+============================
+
+* ENH: Add basic support for LSF plugin.
+* ENH: New interfaces: ICC, Meshfix, ants.Register, C3dAffineTool, ants.JacobianDeterminant,
+  afni.AutoTcorrelate, DcmStack
 * ENH: New workflows: ants template building (both using 'ANTS' and the new 'antsRegistration')
 * ENH: New examples: how to use ANTS template building workflows (smri_ants_build_tmeplate),
   how to set SGE specific options (smri_ants_build_template_new)
 * ENH: added no_flatten option to Merge
+* ENH: added versioning option and checking to traits
+* ENH: added deprecation metadata to traits
+* ENH: Slicer interfaces were updated to version 4.1
 
 Release 0.6.0 (Jun 30, 2012)
 ============================

doc/_templates/sidebar_versions.html

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@ <h3>{{ _('Versions') }}</h3>
 <td align="left">Release</td><td align="right">Devel</td>
 </tr>
 <tr>
-<td align="left">0.6.0</td><td align="right">pre-0.7</td>
+<td align="left">0.7.0</td><td align="right">pre-0.8</td>
 </tr>
 <tr>
 <td align="left"><a href="{{pathto('users/install')}}">Download</a></td>

doc/devel/interface_specs.rst

Lines changed: 44 additions & 1 deletion
@@ -262,7 +262,50 @@ Common
     can be set to either `True` or `False`. `False` indicates that contents
     should be symlinked, while `True` indicates that the contents should be
     copied over.
-
+
+``min_ver`` and ``max_ver``
+    These metadata determine if a particular trait will be available when a
+    given version of the underlying interface runs. Note that this check is
+    performed at runtime.::
+
+        class RealignInputSpec(BaseInterfaceInputSpec):
+            jobtype = traits.Enum('estwrite', 'estimate', 'write', min_ver='5',
+                                  usedefault=True)
+``deprecated`` and ``new_name``
+    This is metadata for removing or renaming an input field from a spec.::
+
+        class RealignInputSpec(BaseInterfaceInputSpec):
+            jobtype = traits.Enum('estwrite', 'estimate', 'write',
+                                  deprecated='0.8',
+                                  desc='one of: estimate, write, estwrite',
+                                  usedefault=True)
+
+    In the above example this means that the `jobtype` input is deprecated and
+    will be removed in version 0.8. Deprecation should be set to two versions
+    from current release. Raises `TraitError` after package version crosses the
+    deprecation version.
+
+    For inputs that are being renamed, one can specify the new name of the
+    field.::
+
+        class RealignInputSpec(BaseInterfaceInputSpec):
+            jobtype = traits.Enum('estwrite', 'estimate', 'write',
+                                  deprecated='0.8', new_name='job_type',
+                                  desc='one of: estimate, write, estwrite',
+                                  usedefault=True)
+            job_type = traits.Enum('estwrite', 'estimate', 'write',
+                                   desc='one of: estimate, write, estwrite',
+                                   usedefault=True)
+
+    In the above example, the `jobtype` field is being renamed to `job_type`.
+    When `new_name` is provided it must exist as a trait, otherwise an exception
+    will be raised.
+
+    .. note::
+
+        The version information for `min_ver`, `max_ver` and `deprecated` has to be
+        provided as a string. For example, `min_ver='0.1'`.
+
 CommandLine
 ^^^^^^^^^^^
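
The deprecation metadata documented in the hunk above is exercised as soon as a user sets the old input. A minimal, self-contained sketch of that behaviour, assuming the import paths of this nipype era and the spec from the documentation example (the comment describes expected behaviour, not quoted output)::

    from nipype.interfaces.base import BaseInterfaceInputSpec, traits

    class RealignInputSpec(BaseInterfaceInputSpec):
        jobtype = traits.Enum('estwrite', 'estimate', 'write',
                              deprecated='0.8', new_name='job_type',
                              desc='one of: estimate, write, estwrite',
                              usedefault=True)
        job_type = traits.Enum('estwrite', 'estimate', 'write',
                               desc='one of: estimate, write, estwrite',
                               usedefault=True)

    spec = RealignInputSpec()
    # Expected to emit a deprecation warning and copy the value to job_type;
    # once the installed package version passes 0.8, a TraitError is raised instead.
    spec.jobtype = 'estimate'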

doc/documentation.rst

Lines changed: 1 addition & 1 deletion
@@ -9,7 +9,7 @@ Documentation
 :Release: |version|
 :Date: |today|
 
-Previous versions: `0.5.3 <http://nipy.org/nipype/0.5.3>`_ `0.4.1 <http://nipy.org/nipype/0.4.1>`_
+Previous versions: `0.6 <http://nipy.org/nipype/0.6>`_ `0.5.3 <http://nipy.org/nipype/0.5.3>`_
 
 .. container:: doc2
 
doc/index.rst

Lines changed: 0 additions & 7 deletions
@@ -1,10 +1,3 @@
-.. admonition:: Announcement
-
-   Nipype Connectivity Workshop 2012 in Magdeburg, Germany: Sep 8-9, 2012:
-   `Information and Registration`__
-
-   __ http://nipype.blogspot.com
-
 .. list-table::
 
   * - .. image:: images/nipype_architecture_overview2.png

doc/users/config_file.rst

Lines changed: 5 additions & 0 deletions
@@ -109,6 +109,11 @@ Execution
     data through (without copying) (possible values: ``true`` and
     ``false``; default value: ``false``)
 
+*stop_on_unknown_version*
+    If this is set to True, an underlying interface will raise an error, when no
+    version information is available. Please notify developers or submit a
+    patch.
+
 Example
 ~~~~~~~
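
For reference, the new *stop_on_unknown_version* option can also be toggled programmatically. A minimal sketch, with the caveat that the import path and the ``config.set`` call are assumptions about the standard config object rather than quotes from this commit; the same key can equally go in the ``[execution]`` section of a nipype config file::

    from nipype import config  # in some releases: from nipype.utils.config import config

    # Fail loudly when an interface cannot report its version
    # ('true'/'false' strings follow the config-file convention).
    config.set('execution', 'stop_on_unknown_version', 'true')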

doc/users/install.rst

Lines changed: 2 additions & 2 deletions
@@ -9,8 +9,8 @@ This page covers the necessary steps to install Nipype.
 Download
 --------
 
-Release 0.6.0: [`zip <http://github.com/nipy/nipype/zipball/0.6.0>`_ `tar
-<http://github.com/nipy/nipype/tarball/0.6.0>`_]
+Release 0.7.0: [`zip <https://github.com/nipy/nipype/archive/0.7.zip>`_ `tar.gz
+<https://github.com/nipy/nipype/archive/0.7.tar.gz>`_]
 
 Development: [`zip <http://github.com/nipy/nipype/zipball/master>`_ `tar
 <http://github.com/nipy/nipype/tarball/master>`_]

doc/users/plugins.rst

Lines changed: 23 additions & 6 deletions
@@ -9,8 +9,8 @@ available plugins allow local and distributed execution of workflows and
 debugging. Each available plugin is described below.
 
 Current plugins are available for Linear, Multiprocessing, IPython_ distributed
-processing platforms and for direct processing on SGE_, PBS_, and Condor_. We
-anticipate future plugins for the Soma_ workflow and LSF_.
+processing platforms and for direct processing on SGE_, PBS_, Condor_, and LSF_. We
+anticipate future plugins for the Soma_ workflow.
 
 .. note::
 
@@ -34,7 +34,7 @@ Optional arguments::
 .. note::
 
     Except for the status_callback, the remaining arguments only apply to the
-    distributed plugins: MultiProc/IPython(X)/SGE/PBS/Condor
+    distributed plugins: MultiProc/IPython(X)/SGE/PBS/Condor/LSF
 
 For example:
 
@@ -71,11 +71,17 @@ a local system.
 
 Optional arguments::
 
-    n_procs : Number of processes to launch in parallel
+    n_procs : Number of processes to launch in parallel, if not set number of
+              processors/threads will be automatically detected
 
 To distribute processing on a multicore machine, simply call::
 
-    workflow.run(plugin='MultiProc', plugin_args={'n_procs' : 2})
+    workflow.run(plugin='MultiProc')
+
+This will use all available CPUs. If on the other hand you would like to restrict
+the number of used resources (to say 2 CPUs), you can call::
+
+    workflow.run(plugin='MultiProc', plugin_args={'n_procs' : 2}
 
 IPython
 -------
@@ -102,7 +108,7 @@ SGE/PBS
 In order to use nipype with SGE_ or PBS_ you simply need to call::
 
     workflow.run(plugin='SGE')
-    workflow.run(plugin='PBS)
+    workflow.run(plugin='PBS')
 
 Optional arguments::
 
@@ -130,6 +136,17 @@ particular node might use more resources than other nodes in a workflow.
 
     node.plugin_args = {'qsub_args': '-l nodes=1:ppn=3', 'overwrite': True}
 
+LSF
+---
+
+Submitting via LSF is almost identical to SGE above:
+
+    workflow.run(plugin='LSF')
+
+Optional arguments::
+
+    template: custom template file to use
+    bsub_args: any other command line args to be passed to bsub.
 
 Condor
 ------
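
As a quick, hedged illustration of the plugin calls documented in this file, the sketch below builds a toy one-node workflow and runs it through the MultiProc and new LSF plugins; the workflow, node names, and the '-q short' queue are invented for the example and are not part of the commit::

    import nipype.pipeline.engine as pe
    import nipype.interfaces.utility as util

    def double(x):
        return 2 * x

    # One-node toy workflow, only to have something to run
    node = pe.Node(interface=util.Function(input_names=['x'], output_names=['doubled'],
                                           function=double),
                   name='double')
    node.inputs.x = 21
    wf = pe.Workflow(name='plugin_demo')
    wf.add_nodes([node])

    wf.run(plugin='MultiProc')                                   # use all detected CPUs
    wf.run(plugin='MultiProc', plugin_args={'n_procs': 2})       # cap at two processes
    wf.run(plugin='LSF', plugin_args={'bsub_args': '-q short'})  # '-q short' is a made-up queue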

examples/dmri_connectivity_advanced.py

Lines changed: 13 additions & 2 deletions
@@ -56,12 +56,13 @@
 import nipype.interfaces.cmtk as cmtk
 import nipype.interfaces.dipy as dipy
 import inspect
-import os.path as op # system functions
+import os, os.path as op # system functions
 from nipype.workflows.dmri.fsl.dti import create_eddy_correct_pipeline
 from nipype.workflows.dmri.camino.connectivity_mapping import select_aparc_annot
 from nipype.utils.misc import package_check
 import warnings
 from nipype.workflows.dmri.connectivity.nx import create_networkx_pipeline, create_cmats_to_csv_pipeline
+from nipype.workflows.smri.freesurfer import create_tessellation_flow
 
 try:
     package_check('cmp')
@@ -82,6 +83,9 @@
 fs.FSCommand.set_default_subjects_dir(subjects_dir)
 fsl.FSLCommand.set_default_output_type('NIFTI')
 
+fs_dir = os.environ['FREESURFER_HOME']
+lookup_file = op.join(fs_dir,'FreeSurferColorLUT.txt')
+
 """
 This needs to point to the fdt folder you can find after extracting
@@ -328,7 +332,7 @@
 
 CFFConverter = pe.Node(interface=cmtk.CFFConverter(), name="CFFConverter")
 CFFConverter.inputs.script_files = op.abspath(inspect.getfile(inspect.currentframe()))
-giftiSurfaces = pe.Node(interface=util.Merge(8), name="GiftiSurfaces")
+giftiSurfaces = pe.Node(interface=util.Merge(9), name="GiftiSurfaces")
 giftiLabels = pe.Node(interface=util.Merge(2), name="GiftiLabels")
 niftiVolumes = pe.Node(interface=util.Merge(3), name="NiftiVolumes")
 fiberDataArrays = pe.Node(interface=util.Merge(4), name="FiberDataArrays")
@@ -344,6 +348,9 @@
 NxStatsCFFConverter = pe.Node(interface=cmtk.CFFConverter(), name="NxStatsCFFConverter")
 NxStatsCFFConverter.inputs.script_files = op.abspath(inspect.getfile(inspect.currentframe()))
 
+tessflow = create_tessellation_flow(name='tessflow', out_format='gii')
+tessflow.inputs.inputspec.lookup_file = lookup_file
+
 """
 Connecting the workflow
 =======================
@@ -371,6 +378,9 @@
 mapping.connect([(inputnode, FreeSurferSourceRH,[("subjects_dir","subjects_dir")])])
 mapping.connect([(inputnode, FreeSurferSourceRH,[("subject_id","subject_id")])])
 
+mapping.connect([(inputnode, tessflow,[("subjects_dir","inputspec.subjects_dir")])])
+mapping.connect([(inputnode, tessflow,[("subject_id","inputspec.subject_id")])])
+
 mapping.connect([(inputnode, parcellate,[("subjects_dir","subjects_dir")])])
 mapping.connect([(inputnode, parcellate,[("subject_id","subject_id")])])
 mapping.connect([(parcellate, mri_convert_ROI_scale500,[('roi_file','in_file')])])
@@ -516,6 +526,7 @@
 mapping.connect([(mris_convertRHinflated, giftiSurfaces,[("converted","in6")])])
 mapping.connect([(mris_convertLHsphere, giftiSurfaces,[("converted","in7")])])
 mapping.connect([(mris_convertRHsphere, giftiSurfaces,[("converted","in8")])])
+mapping.connect([(tessflow, giftiSurfaces,[("outputspec.meshes","in9")])])
 
 mapping.connect([(mris_convertLHlabels, giftiLabels,[("converted","in1")])])
 mapping.connect([(mris_convertRHlabels, giftiLabels,[("converted","in2")])])
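
Since the example now reads ``FREESURFER_HOME`` directly from the environment, a small defensive sketch may help when adapting it; the guard, the error messages, and the existence check are illustrative additions, not part of the commit::

    import os
    import os.path as op

    fs_dir = os.environ.get('FREESURFER_HOME')
    if fs_dir is None:
        raise RuntimeError('FREESURFER_HOME is not set; source the FreeSurfer setup script first')
    lookup_file = op.join(fs_dir, 'FreeSurferColorLUT.txt')
    if not op.exists(lookup_file):
        raise RuntimeError('FreeSurferColorLUT.txt not found under %s' % fs_dir)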

examples/dmri_group_connectivity_mrtrix.py

Lines changed: 0 additions & 6 deletions
@@ -140,12 +140,6 @@
 
 l1pipeline = create_group_connectivity_pipeline(group_list, group_id, data_dir, subjects_dir, output_dir, info)
 
-# This is used to demonstrate the ease through which different parameters can be set for each group.
-if group_id == 'parkinsons':
-    l1pipeline.inputs.connectivity.mapping.threshold_FA.absolute_threshold_value = 0.5
-else:
-    l1pipeline.inputs.connectivity.mapping.threshold_FA.absolute_threshold_value = 0.7
-
 # Here with invert the b-vectors in the Y direction and set the maximum harmonic order of the
 # spherical deconvolution step
 l1pipeline.inputs.connectivity.mapping.fsl2mrtrix.invert_y = True
