
Hyperopt: Distributed Hyperparameter Optimization


Hyperopt is a Python library for serial and parallel optimization over awkward search spaces, which may include real-valued, discrete, and conditional dimensions.

Getting started

Install hyperopt from PyPI:

pip install hyperopt

Then run your first example:

# define an objective function
def objective(args):
    case, val = args
    if case == 'case 1':
        return val
    else:
        return val ** 2

# define a search space
from hyperopt import hp
space = hp.choice('a',
    [
        ('case 1', 1 + hp.lognormal('c1', 0, 1)),
        ('case 2', hp.uniform('c2', -10, 10))
    ])

# minimize the objective over the space
from hyperopt import fmin, tpe, space_eval
best = fmin(objective, space, algo=tpe.suggest, max_evals=100)

print(best)
# -> {'a': 1, 'c2': 0.01420615366247227}
print(space_eval(space, best))
# -> ('case 2', 0.01420615366247227)
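
If you also want a record of every evaluation, pass a Trials object to fmin; a minimal sketch, reusing the objective and space defined above:

# keep a record of every evaluation with a Trials object
from hyperopt import Trials

trials = Trials()
best = fmin(objective, space, algo=tpe.suggest, max_evals=100, trials=trials)

print(trials.losses())     # loss of each evaluation, in order
print(trials.best_trial)   # full record of the best trial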

Contributing

If you're a developer and wish to contribute, please follow these steps.

Setup (based on this)

  1. Create an account on GitHub if you do not already have one.

  2. Fork the project repository: click on the ‘Fork’ button near the top of the page. This creates a copy of the code under your GitHub user account. For more details on how to fork a repository see this guide.

  3. Clone your fork of the hyperopt repo from your GitHub account to your local disk:

    git clone https://github.com/<github username>/hyperopt.git
    cd hyperopt
  4. Create an environment:

    $ python3 -m venv my_env    (or: $ python -m venv my_env)

    or with conda:

    $ conda create -n my_env python=3

  5. Activate the environment:
    $ source my_env/bin/activate
    or with conda:
    $ conda activate my_env

  6. Install dependencies for extras (you'll need these to run pytest):

    Linux/UNIX:

    $ pip install -e '.[MongoTrials, SparkTrials, ATPE, dev]'

    or Windows:

    pip install -e .[MongoTrials]
    pip install -e .[SparkTrials]
    pip install -e .[ATPE]
    pip install -e .[dev]
  7. Add the upstream remote. This saves a reference to the main hyperopt repository, which you can use to keep your repository synchronized with the latest changes:

    $ git remote add upstream https://github.com/hyperopt/hyperopt.git

    You should now have a working installation of hyperopt and a properly configured git repository. The next steps describe the process of modifying code and submitting a PR:

  8. Synchronize your master branch with the upstream master branch:

    git checkout master
    git pull upstream master
  9. Create a feature branch to hold your development changes:

    $ git checkout -b my_feature

    and start making changes. Always use a feature branch. It’s good practice to never work on the master branch!

  10. We recommend using Black, which is installed automatically in step 6, to format your code before submitting a PR.

  11. Then, once you commit, ensure that git hooks are activated (PyCharm, for example, has an option to omit them). This can be done using pre-commit, which is installed automatically in step 6, as follows:

    pre-commit install

    This will run black automatically on all files you modified when you commit, failing if any files still require formatting. If black does not run, execute the following:

    black {source_file_or_directory}
  12. Develop the feature on your feature branch on your computer, using Git for version control. When you're done editing, add changed files using git add and then git commit:

    git add modified_files
    git commit -m "my first hyperopt commit"
  13. The tests for this project use PyTest and can be run by calling pytest.

  14. Record your changes in Git, then push the changes to your GitHub account with:

    git push -u origin my_feature

Note that dev dependencies require python 3.6+.

Algorithms

Currently three algorithms are implemented in hyperopt:

  • Random Search
  • Tree of Parzen Estimators (TPE)
  • Adaptive TPE

Hyperopt has been designed to accommodate Bayesian optimization algorithms based on Gaussian processes and regression trees, but these are not currently implemented.

All algorithms can be parallelized in two ways, using:

  • Apache Spark
  • MongoDB
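
For example, a minimal sketch of both options, reusing the objective and space from the example above (the MongoDB URL is a placeholder):

from hyperopt import fmin, tpe, SparkTrials

# evaluate up to 4 trials concurrently on a Spark cluster
spark_trials = SparkTrials(parallelism=4)
best = fmin(objective, space, algo=tpe.suggest, max_evals=100, trials=spark_trials)

# or coordinate a pool of hyperopt-mongo-worker processes through MongoDB
from hyperopt.mongoexp import MongoTrials
mongo_trials = MongoTrials('mongo://localhost:27017/foo_db/jobs', exp_key='exp1')
best = fmin(objective, space, algo=tpe.suggest, max_evals=100, trials=mongo_trials)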

Documentation

Hyperopt documentation can be found here, but is partly still hosted on the wiki.

Related Projects

Examples

See projects using hyperopt on the wiki.

Announcements mailing list

Discussion mailing list

Cite

If you use this software for research, please cite the paper (http://proceedings.mlr.press/v28/bergstra13.pdf) as follows:

Bergstra, J., Yamins, D., Cox, D. D. (2013) Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures. Proc. of the 30th International Conference on Machine Learning (ICML 2013), June 2013, pp. I-115 to I-123.

Thanks

This project has received support from

  • National Science Foundation (IIS-0963668),
  • Banting Postdoctoral Fellowship program,
  • Natural Sciences and Engineering Research Council of Canada (NSERC),
  • D-Wave Systems, Inc.


hyperopt's Issues

blocking run in mongoexperiment

So in my application I want to be able to set off a master process that blocks until all mongoexperiment jobs are done,
e.g. something that continually polls until there's nothing left in the queue. Currently MongoExperiment.run only blocks until the last job is set off.

I suspect that adding the feature I want is an easy addition to the run method. Do you think that having this kind of blocking as a keyword option ("block_until_done" or something) would be useful?
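
A rough sketch of the proposed behaviour, built on the queue_len method that MongoExperiment already exposes (the wrapper and its poll interval are hypothetical):

import time

def run_block_until_done(exp, N, poll_interval=5.0):
    # hypothetical wrapper: enqueue as usual, then poll until the queue drains
    exp.run(N)
    while exp.queue_len() > 0:
        time.sleep(poll_interval)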

mongoexp duplicate detection is thwarted by _config_id mechanism

Do we really need duplicate detection anyway? MongoExperiments are asynchronous so it is really unusual to ever have exactly the same experiment twice. The dup-detection seems mainly useful in really small search spaces, which isn't really what the package is for.

I think the right thing to do is delete the so-called "duplicate detection" mechanism code.

average best error over time

new plot idea

  • log scale time
  • at points 50, 100, 200, 400, etc. call average best error using jobs up to that point
  • plot average best error with error bars

This can meaningfully compare random search with non-random search algorithms.
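
A minimal sketch of the computation (the helper and losses_per_run, a list holding one loss sequence per independent run, are hypothetical; each run is assumed to contain at least max(checkpoints) trials):

import numpy as np
import matplotlib.pyplot as plt

def plot_average_best_error(losses_per_run, checkpoints=(50, 100, 200, 400)):
    # best (minimum) loss seen up to each checkpoint, per run
    best = np.array([[min(run[:n]) for n in checkpoints] for run in losses_per_run])
    mean, std = best.mean(axis=0), best.std(axis=0)
    plt.errorbar(checkpoints, mean, yerr=std)   # average best error with error bars
    plt.xscale('log')                           # log-scale time axis, as proposed
    plt.xlabel('trials')
    plt.ylabel('average best error')
    plt.show()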

MongoExperiment.min_queue_len behavior

In MongoExperiment.run there's this block of code:

while n_queued < N and self.queue_len() < self.min_queue_len:
    [... submit jobs ...]

@jaberg, could you explain the intention of requiring self.queue_len() < self.min_queue_len? By default self.min_queue_len = 1, which means that (since the job submission is faster than job execution) the while condition is false after the first job submission, so only one job gets submitted.

Is it your intention for the user to set self.min_queue_len much larger somehow? Or is this a bug? Or what?

Weird Theano error about expected type_num on Linux but not on Mac

I've tried getting the hyperopt tutorials to work on a couple of Linux machines.
Using the random bandit works OK, i.e.:

hyperopt-search ab.AvsB hyperopt.bandit_algos.Random

writes me my experiment.pkl file.

However, when I try the next example, I get a strange message from the depths of Theano.

~/src/i686/hyperopt/bin/hyperopt-search ab.AvsB hyperopt.bandit_algos.AdaptiveParzenGM

log_thunk_trace: There was a problem executing an Op.
Traceback (most recent call last):
  File "/home/gtaylor/src/i686/hyperopt/bin/hyperopt-search", line 4, in <module>
    sys.exit(hyperopt.experiments.main_search())
  File "/home/gtaylor/src/i686/hyperopt/hyperopt/experiments.py", line 100, in main_search
    self.run(options.steps)
  File "/home/gtaylor/src/i686/hyperopt/hyperopt/experiments.py", line 29, in run
    trial = algo.suggest(self.trials, self.results, 1)[0]
  File "/home/gtaylor/src/i686/hyperopt/hyperopt/theano_gm.py", line 211, in suggest
    return self.suggest_ivl(self.suggest_from_prior(N))
  File "/home/gtaylor/src/i686/hyperopt/hyperopt/theano_gm.py", line 115, in suggest_from_prior
    rvals = prior_sampler(N)
  File "/home/gtaylor/src/i686/Theano/theano/compile/function_module.py", line 610, in __call__
    self.fn()
  File "/home/gtaylor/src/i686/Theano/theano/gof/link.py", line 344, in streamline_default_f
    raise_with_op(node)
  File "/home/gtaylor/src/i686/Theano/theano/gof/link.py", line 340, in streamline_default_f
    thunk()
  File "/home/gtaylor/src/i686/Theano/theano/gof/op.py", line 537, in rval
    fill_storage()
  File "/home/gtaylor/src/i686/Theano/theano/gof/cc.py", line 1178, in __call__
    raise exc_type, exc_value, exc_trace
ValueError: ('expected type_num 9 (NPY_INT64) got 5', <TensorType(int64, vector)>)

Strangely enough, it works fine on my Mac. On both Mac and Linux I am using the Enthought Python distribution 7.1.
This supplies Python 2.7.2.
Both are using Theano and hyperopt from git (master @ HEAD right now).
Basically what I am trying to say is that my working and non-working setup are about as close as they can be (except for the fact that they are different platforms).

I haven't experienced any other Theano-related problems on the Linux machine.

hyperopt-search load behavior is sneaky

$ hyperopt-search algo bandit1
CTRL-C
$ hyperopt-search algo bandit2

This actually just continues running bandit1, which was saved.

If you run a search command, it might not use the bandit and algorithm that you wrote, because it secretly loaded an old experiment.pkl file. What's a better interface for this?

Maybe the default --load should be an empty string?

example of using hyperopt with non-Python Bandit

Minimally - this can be done by writing config dictionaries to file, and calling other program via subprocess.
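
A minimal sketch of that approach (./my_bandit is a hypothetical external program that reads a JSON config file and prints its loss on stdout):

import json
import subprocess
import tempfile

def objective(config):
    # write the suggested configuration to a file for the foreign program
    with tempfile.NamedTemporaryFile('w', suffix='.json', delete=False) as f:
        json.dump(config, f)
    # run the external evaluator and parse the loss it prints
    out = subprocess.check_output(['./my_bandit', f.name])
    return float(out.decode().strip())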

For full interaction via the Ctrl object, the equivalent to mongoexp.CtrlObj must be implemented in the foreign language for use by a worker process written in the other language.

This issue was raised in conversation with @alextp

bandits failing silently

Currently it seems that when bandit.evaluation throws an exception, the basic Experiment object's run method fails silently.

I have a test to contribute for this if it is not supposed to happen this way.

CategoryKernel with all pairs dissimilarity

Currently, choices for the GP's categorical variable are just same/different. It would probably be better to have separate parameters for the dissimilarity of each pair (a, b) where a != b.

Improve search efficiency of mongoexp by delaying suggest()

It is inefficient for mongoexp to call bandit_algo.suggest() as soon as the queue is empty. It would be better for it to wait until a worker is idle. The workers should be posting that kind of info to the worker collection anyway, so the MongoExperiment should be able to use it without too much rewriting.
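
A purely hypothetical sketch of the idea (any_worker_idle stands in for whatever query against the worker collection would report an idle worker):

import time

def suggest_when_worker_idle(exp, algo):
    # wait for a worker to report itself idle before asking for a new point
    while not exp.any_worker_idle():   # hypothetical helper
        time.sleep(1.0)
    return algo.suggest(exp.trials, exp.results, 1)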

have workers accept exp_key

so that you can run different workers on different kinds of machines from the same database at the same time,

e.g. one experiment that sends out a few big jobs to a GPU cluster and one that sends many small jobs to a CPU cluster ... without having to have different DBs.
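
For reference, a sketch of how this looks with MongoTrials and an exp_key (the URL is a placeholder, and the worker flag in the comment is an assumption):

from hyperopt.mongoexp import MongoTrials

# two experiments sharing one database, separated by exp_key
gpu_trials = MongoTrials('mongo://localhost:27017/shared_db/jobs', exp_key='big_gpu_jobs')
cpu_trials = MongoTrials('mongo://localhost:27017/shared_db/jobs', exp_key='small_cpu_jobs')

# workers on each cluster would then filter on the key they serve, e.g.
# (assumed flag): hyperopt-mongo-worker --mongo=localhost:27017/shared_db --exp-key=big_gpu_jobs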

GP - support for multiple variables

Add 2 tests to test_theano_gp that use two normals:

  1. both are equally important in the evaluate()
  2. one is irrelevant to evaluate()

Optimize both, and show that the length scales are about equal in the first case, but far from equal in the second case.
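
An illustrative sketch of the two objectives in today's hyperopt idiom (names are hypothetical):

from hyperopt import hp

# two-variable search space
space = {'a': hp.normal('a', 0, 1), 'b': hp.normal('b', 0, 1)}

def evaluate_both(args):
    # case 1: both variables are equally important
    return args['a'] ** 2 + args['b'] ** 2

def evaluate_one(args):
    # case 2: 'b' is irrelevant
    return args['a'] ** 2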

qlen computation in base.Trials.run

I THINK that the following statement should be true:

  1. If you initialize an experiment to have async = False,

  2. if you're in a totally serial execution environment, and

  3. if max_queue_len = 1,

THEN: during an experiment run, there should be no circumstance in which suggest is called twice before serial_execute is called once.

However, currently this is not the case, because even if a job has just been suggested in the inner while loop of the run method, the line
qlen = get_queue_len()
can still return 0, before the serial_execute to run the job has been called.

I think this is a bug.

Exception in theano_gm

Suddenly, in the middle of a job last night, theano_gm produced the following exception:

Traceback (most recent call last):
  File "/home/tools/centos5_cpu/hyperopt/bin/hyperopt-mongo-search", line 8, in <module>
    sys.exit(hyperopt.mongoexp.main_search())
  File "/home/tools/centos5_cpu/hyperopt/hyperopt/mongoexp.py", line 1309, in main_search
    self.run(options.steps, block_until_done=options.block)
  File "/home/tools/centos5_cpu/hyperopt/hyperopt/mongoexp.py", line 687, in run
    suggestions = algo.suggest(self.trials, self.results, 1)
  File "/home/tools/centos5_cpu/hyperopt/hyperopt/theano_gm.py", line 213, in suggest
    return self.suggest_ivl(self.suggest_from_model(ivls, N))
  File "/home/tools/centos5_cpu/hyperopt/hyperopt/theano_gm.py", line 152, in suggest_from_model
    x_all.idxset())
AssertionError: (set([2048, 2049, 2050, 2051, 2052, 2053, 2054, 2055, 2056, 2057, 2058, 2059, 2060, 2061, 2062, 2063, 2064, 2065, 2066, 2067, 2068, 2069, 2070, 2071, 2072, 2073, 2074, 2075, 2076, 2077, 2078, 2079, 1315, 1316, 1317, 1318, 1319, 1320, 1321, 1322, 1323, 1325, 1326, 1328, 1330, 1332, 1334, 1336, 1338, 1340, 1342, 1349, 1351, 1353, 1355, 1359, 1361, 1363, 1365, 1367, 1369, 1371, 1373, 1375, 1377, 1379, 1380, 1381, 1382, 1383, 1384, 1385, 1386, 1387, 1388, 1389, 1390, 1391, 1392, 1393, 1394, 1395, 1396, 1397, 1398, 1399, 1400, 1401, 1402, 1403, 1404, 1405, 1406, 1407, 1408, 1409, 1410, 1411, 1412, 1413, 1414, 1415, 1416, 1417, 1418, 1419, 1420, 1421, 1422, 1423, 1424, 1425, 1426, 1427, 1428, 1429, 1430, 1431, 1432, 1433, 1434, 1435, 1436, 1437, 1438, 1439, 1441, 1443, 1444, 1445, 1446, 1447, 1448, 1449, 1450, 1451, 1453, 1454, 1455, 1456, 1457, 1458, 1459, 1460, 1461, 1462, 1464, 1465, 1466, 1467, 1468, 1469, 1470, 1471, 1472, 1473, 1474, 1475, 1476, 1477, 1478, 1479, 1480, 1481, 1482, 1483, 1484, 1486, 1487, 1488, 1489, 1490, 1491, 1492, 1493, 1494, 1495, 1496, 1497, 1498, 1499, 1500, 1501, 1502, 1503, 1504, 1505, 1506, 1507, 1508, 1509, 1510, 1511, 1512, 1513, 1514, 1515, 1516, 1517, 1518, 1520, 1521, 1522, 1523, 1524, 1525, 1526, 1527, 1528, 1529, 1530, 1531, 1532, 1533, 1534, 1535, 1536, 1537, 1538, 1539, 1540, 1541, 1542, 1543, 1545, 1546, 1547, 1548, 1549, 1550, 1551, 1552, 1553, 1554, 1555, 1556, 1557, 1558, 1559, 1560, 1561, 1562, 1563, 1564, 1565, 1566, 1567, 1568, 1569, 1570, 1571, 1572, 1573, 1574, 1575, 1576, 1577, 1578, 1579, 1580, 1581, 1582, 1583, 1584, 1585, 1586, 1587, 1588, 1589, 1590, 1591, 1592, 1593, 1594, 1595, 1596, 1597, 1598, 1599, 1600, 1601, 1602, 1603, 1604, 1605, 1606, 1607, 1608, 1609, 1610, 1611, 1612, 1613, 1614, 1615, 1616, 1617, 1618, 1619, 1620, 1621, 1622, 1623, 1624, 1625, 1626, 1627, 1628, 1629, 1630, 1631, 1632, 1633, 1634, 1635, 1636, 1637, 1638, 1639, 1640, 1641, 1642, 1643, 1644, 1645, 1646, 1647, 1648, 1649, 1650, 1651, 1652, 1653, 1654, 1655, 1656, 1657, 1658, 1659, 1660, 1661, 1662, 1663, 1664, 1665, 1666, 1667, 1668, 1669, 1670, 1671, 1672, 1673, 1674, 1675, 1676, 1677, 1678, 1679, 1680, 1681, 1682, 1683, 1684, 1685, 1686, 1687, 1688, 1689, 1690, 1691, 1692, 1693, 1694, 1695, 1696, 1697, 1698, 1699, 1700, 1701, 1702, 1703, 1704, 1705, 1706, 1707, 1708, 1709, 1710, 1711, 1712, 1713, 1714, 1715, 1716, 1717, 1718, 1719, 1720, 1721, 1722, 1723, 1724, 1725, 1726, 1727, 1728, 1729, 1730, 1731, 1732, 1733, 1734, 1735, 1736, 1737, 1738, 1739, 1740, 1741, 1742, 1743, 1744, 1745, 1746, 1747, 1748, 1749, 1750, 1751, 1752, 1753, 1754, 1755, 1756, 1757, 1758, 1759, 1760, 1761, 1762, 1763, 1764, 1765, 1766, 1767, 1768, 1769, 1770, 1771, 1772, 1773, 1774, 1775, 1776, 1777, 1778, 1779, 1780, 1781, 1782, 1783, 1784, 1785, 1786, 1787, 1788, 1789, 1790, 1791, 1792, 1793, 1794, 1795, 1796, 1797, 1798, 1799, 1800, 1801, 1802, 1803, 1804, 1805, 1806, 1807, 1808, 1809, 1810, 1811, 1812, 1813, 1814, 1815, 1816, 1817, 1818, 1819, 1820, 1821, 1822, 1823, 1824, 1825, 1826, 1828, 1829, 1830, 1831, 1832, 1833, 1834, 1835, 1836, 1837, 1838, 1839, 1840, 1841, 1842, 1843, 1844, 1845, 1846, 1847, 1848, 1849, 1850, 1851, 1852, 1853, 1854, 1855, 1856, 1857, 1858, 1859, 1860, 1861, 1862, 1863, 1864, 1865, 1866, 1867, 1868, 1869, 1870, 1871, 1872, 1873, 1874, 1875, 1876, 1877, 1878, 1879, 1880, 1881, 1882, 1883, 1884, 1885, 1886, 1887, 1888, 1889, 1890, 1891, 1892, 1893, 1894, 1895, 1896, 1897, 1898, 1899, 1900, 1901, 1902, 1903, 1904, 1905, 1906, 1907, 1908, 1909, 
1910, 1911, 1912, 1913, 1914, 1915, 1916, 1917, 1918, 1919, 1920, 1921, 1922, 1923, 1924, 1925, 1926, 1927, 1928, 1929, 1930, 1931, 1932, 1933, 1934, 1935, 1936, 1937, 1938, 1939, 1940, 1941, 1942, 1943, 1944, 1945, 1946, 1947, 1948, 1949, 1950, 1951, 1952, 1953, 1954, 1955, 1956, 1957, 1958, 1959, 1960, 1961, 1962, 1963, 1964, 1965, 1966, 1967, 1968, 1969, 1970, 1971, 1972, 1973, 1974, 1975, 1976, 1977, 1978, 1979, 1980, 1981, 1982, 1983, 1984, 1985, 1986, 1987, 1988, 1989, 1990, 1991, 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022, 2023, 2024, 2025, 2026, 2027, 2028, 2029, 2030, 2031, 2032, 2033, 2034, 2035, 2036, 2037, 2038, 2039, 2040, 2041, 2042, 2043, 2044, 2045, 2046, 2047]), set([]))

undefined symbol: what am I missing?

sys.exit(hyperopt.mongoexp.main_search())

  File "/home/tools/centos5_cpu/hyperopt/hyperopt/mongoexp.py", line 1309, in main_search
    self.run(options.steps, block_until_done=options.block)
  File "/home/tools/centos5_cpu/hyperopt/hyperopt/mongoexp.py", line 687, in run
    suggestions = algo.suggest(self.trials, self.results, 1)
  File "/home/tools/centos5_cpu/hyperopt/hyperopt/theano_gp.py", line 1338, in suggest
    rval = self.suggest_ivl(fn(trials, results, N))
  File "/home/tools/centos5_cpu/hyperopt/hyperopt/theano_gp.py", line 1248, in suggest_from_gp
    self.fit_GP(*prepared_data)
  File "/home/tools/centos5_cpu/hyperopt/hyperopt/theano_gp.py", line 965, in fit_GP
    mode=self.mode,
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/compile/function.py", line 114, in function
    profile=profile)
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/compile/pfunc.py", line 439, in pfunc
    accept_inplace=accept_inplace, name=name, profile=profile)
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/compile/function_module.py", line 1238, in orig_function
    defaults)
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/compile/function_module.py", line 1088, in create
    _fn, _i, _o = self.linker.make_thunk(input_storage = input_storage_lists)
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/gof/link.py", line 377, in make_thunk
    output_storage = output_storage)[:3]
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/gof/cc.py", line 1261, in make_all
    no_recycling)]
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/gof/op.py", line 532, in make_thunk
    output_storage=node_output_storage)
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/gof/cc.py", line 786, in make_thunk
    keep_lock=keep_lock)
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/gof/cc.py", line 734, in compile
    keep_lock=keep_lock)
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/gof/cc.py", line 1092, in cthunk_factory
    module = get_module_cache().module_from_key(key=key, fn=self.compile_cmodule_by_step, keep_lock=keep_lock)
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/gof/cmodule.py", line 839, in module_from_key
    module = compile_steps.next()
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/gof/cc.py", line 1019, in compile_cmodule_by_step
    preargs=preargs)
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/gof/cmodule.py", line 1401, in gcc_module_compile_str
    return dlimport(lib_filename)
  File "/opt/lib/python2.6/site-packages/Theano-0.4.1-py2.6.egg/theano/gof/cmodule.py", line 186, in dlimport
    rval = __import__(module_name, {}, {}, [module_name])

ImportError: ('/root/.theano/compiledir_Linux-2.6.21.7-2.fc8xen-x86_64-with-redhat-5.5-Final-x86_64-2.6.5/tmpXluXUO/522e60758804332fe52eaa426b89aab8.so: undefined symbol: dgemm_', '[_dot22scalar(<TensorType(float64, col)>, <TensorType(float64, row)>, TensorConstant{2.0})]')

global indexes in IdxsVals

It would be easier to trace and debug things if global indexes (as opposed to relative index values) were used in IdxsVals and IdxsValsList. This goes mainly for GM_Algo.

  • This means that some methods might have to take IdxsVals pairs for Y instead of the current dense representation.

mongoexp trials "cmd" parameter values

It now seems necessary to specify a non-default value for the "cmd" parameter of the experiment object to be able to run mongo trials (though not for serial trials, where "None" works fine).

Is this intended? What are the possible values and how should they be used?

I see from tests how in many cases I should probably configure it, but it seems clunky.
