
devtools's Introduction

devtools


The aim of devtools is to make package development easier by providing R functions that simplify and expedite common tasks. R Packages is a book based around this workflow.

Installation

# Install devtools from CRAN
install.packages("devtools")

# Or the development version from GitHub:
# install.packages("pak")
pak::pak("r-lib/devtools")

Cheatsheet

thumbnail of package development cheatsheet

Usage

All devtools functions accept a path as an argument, e.g. load_all("path/to/mypkg"). If you don't specify a path, devtools will look in the current working directory - this is a recommended practice.

Frequent development tasks:

  • load_all() simulates installing and reloading your package, loading R code in R/, compiled shared objects in src/ and data files in data/. During development you would usually want to access all functions (even un-exported internal ones) so load_all() works as if all functions were exported in the package NAMESPACE.

  • document() updates generated documentation in man/, file collation and NAMESPACE.

  • test() reloads your code with load_all(), then runs all testthat tests.

  • test_coverage() runs test coverage on your package with covr. This makes it easy to see what parts of your package could use more tests!
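
A typical interactive loop with these functions, run from the package's own directory, might look like the sketch below (purely illustrative):

load_all()        # reload R/, src/ and data/ into the current session
document()        # regenerate man/ and NAMESPACE with roxygen2
test()            # reload, then run the testthat suite
test_coverage()   # optionally, inspect test coverage via covr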

Building and installing:

  • install() reinstalls the package, detaches the currently loaded version then reloads the new version with library(). Reloading a package is not guaranteed to work: see the documentation for unload() for caveats.

  • build() builds a package file from package sources. You can use it to build a binary version of your package.

  • install_* functions install an R package:

    • install_github() from GitHub
    • install_gitlab() from GitLab
    • install_bitbucket() from Bitbucket
    • install_url() from an arbitrary url
    • install_git() and install_svn() from an arbitrary git or SVN repository
    • install_local() from a local file on disk
    • install_version() from a specific version on CRAN
  • update_packages() updates a package to the latest version. This works both on packages installed from CRAN as well as those installed from any of the install_* functions.
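
As a concrete sketch of these helpers (the repository name and version number are illustrative only):

install_github("r-lib/usethis")               # install from a GitHub repository
install_version("glue", version = "1.6.2")    # install a specific version from CRAN
update_packages("glue")                       # later, update to the latest version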

Check and release:

  • check() updates the documentation, then builds and checks the package locally. check_win() checks a package using win-builder, and check_rhub() checks a package using r-hub. This allows you to easily check your package on all systems CRAN uses before submission.

  • release() makes sure everything is ok with your package (including asking you a number of questions), then builds and uploads to CRAN.
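
Put together, a pre-submission sequence might look like the following sketch (check_win_devel() is one of the win-builder helpers; exact helper names vary across devtools versions):

check()             # update docs, then build and R CMD check locally
check_win_devel()   # submit the built package to win-builder
release()           # interactive questions, then build and upload to CRAN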

Learning more

R package development can be intimidating; fortunately, there are now a number of valuable resources to help!

Cover image of R Packages book

  1. R Packages is a book that gives a comprehensive treatment of all common parts of package development and uses devtools throughout.

    • The first edition is available at https://r-pkgs.org/, but note that it has grown somewhat out of sync with the current version of devtools.
    • A second edition is under development and is evolving to reflect the current state of devtools. It is available at https://r-pkgs.org.
    • The Whole Game and Package structure chapters are great places to start.
  2. The package development category on RStudio Community is a great place to ask specific questions about package development.

  3. rOpenSci packages has extensive documentation on best practices for packages that are intended for submission to rOpenSci, along with generally useful recommendations for any package author.

  4. There are a number of fantastic blog posts on writing your first package.

  5. Writing R Extensions is the exhaustive, canonical reference for writing R packages, maintained by the R core developers.

Conscious uncoupling

devtools started off as a lean-and-mean package to facilitate local package development, but over the years it accumulated more and more functionality. devtools has undergone a conscious uncoupling to split out functionality into smaller, more tightly focussed packages. This includes:

  • testthat: Writing and running tests (i.e. test()).

  • roxygen2: Function and package documentation (i.e. document()).

  • remotes: Installing packages (i.e. install_github()).

  • pkgbuild: Building binary packages (including checking if build tools are available) (i.e. build()).

  • pkgload: Simulating package loading (i.e. load_all()).

  • rcmdcheck: Running R CMD check and reporting the results (i.e. check()).

  • revdepcheck: Running R CMD check on all reverse dependencies, and figuring out what's changed since the last CRAN release (i.e. revdep_check()).

  • sessioninfo: R session info (i.e. session_info()).

  • usethis: Automating package setup (i.e. use_test()).

Generally, you should not need to worry about these different packages, because devtools installs all of them automatically. You will need to care, however, if you're filing a bug, because reporting it in the correct place will lead to a speedier resolution.

You may also need to care if you are trying to use some devtools functionality in your own package or deployed application. Generally in these cases it is better to depend on the particular package directly rather than depend on devtools, e.g. use sessioninfo::session_info() rather than devtools::session_info(), or remotes::install_github() vs devtools::install_github().

However, for day-to-day development we recommend you continue to use library(devtools) to quickly load all needed development tools, just like library(tidyverse) quickly loads all the tools necessary for data exploration and visualization.

Code of conduct

Please note that the devtools project is released with a Contributor Code of Conduct. By contributing to this project, you agree to abide by its terms.

devtools's People

Contributors

ashander, ateucher, bpbond, brentonk, craigcitro, gaborcsardi, geoff99, hadley, henrikbengtsson, hmalmedal, imanuelcostigan, jennybc, jeroen, jiho, jimhester, kevinushey, klmr, kohske, krlmlr, lev-kuznetsov, lionel-, maelle, malcolmbarrett, ncarchedi, richfitz, rmflight, robertzk, wch, yihui, yoni


devtools's Issues

document() should have an option not to generate the NAMESPACE

Sometimes I want to update my documentation .Rd files but not rewrite my NAMESPACE, yet I don't see an option for this in the document function. In particular, it seems that the order in which commands appear in NAMESPACE matters, but document doesn't generate a consistent order. There may be other reasons a developer might want to update the .Rd files without touching the NAMESPACE. Thanks for considering this, and sorry if I've misunderstood how this works.

load_all handling of imported libraries

It appears to me that the load_all() function does not load libraries that have been imported. This means that I cannot simply call load_all() on my package and expect to run the functions it provides until I explicitly load the imported libraries. I might expect it to load all imported libraries as well as all the functions defined in the package.

Problem with load_all() and loadRcppModules()

Hi,

I'm having trouble using load_all() on a package that uses Rcpp modules. I load my Rcpp modules using the loadRcppModules() function.

Note that it can only be used from within the .onLoad function, but I get an error when load_code() tries to execute env$.onLoad():
"loadRcppModules can only be used within a .onLoad function"

I'm not sure if it makes more sense to instead ask Rcpp to modify these sanity checks. On the other hand, perhaps I'm doing something incorrectly.

Thanks,
Chris

Set a canonical sort order for generating NAMESPACE file

Depending on platform, and LC_COLLATE, the order of exports written to NAMESPACE can change. For a concrete example, in ggplot2, on Windows 7 64 bit (LC_COLLATE=English_United States.1252), I get the order aes, aes_all, aes_string. The file as checked in has aes_all, aes_string, aes. One is not more correct than the other, but after running document() on the package, my NAMESPACE file is marked as changed by git and I have to make sure it does not get included in a commit since there is no functional change there.

This could be solved by locally setting a collation order. See also a recent thread on r-devel "bug in rank(), order(), is.unsorted() on character vector" http://thread.gmane.org/gmane.comp.lang.r.devel/29677
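
A sketch of that idea, forcing a fixed collation around the sort (the exports vector here is hypothetical, standing in for whatever is about to be written to NAMESPACE):

old_collate <- Sys.getlocale("LC_COLLATE")
Sys.setlocale("LC_COLLATE", "C")                   # platform-independent ordering
exports <- sort(c("aes_all", "aes_string", "aes"))
Sys.setlocale("LC_COLLATE", old_collate)           # restore the user's locale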

check() runs into latex errors not encountered with R CMD check

When I run R CMD check from the terminal window, the package passes the check. In particular the final check of the PDF version of the manual passes:

R CMD check lubridate
...
* checking PDF version of manual ... OK

But this check fails when I run check() from the R window.

check("lubridate")
...
* checking PDF version of manual ... WARNING
LaTeX errors when creating PDF version.
This typically indicates Rd problems.
* checking PDF version of manual without hyperrefs or index ... ERROR
Re-running with no redirection of stdout/stderr.
Hmm ... looks like a package
Error in texi2dvi("Rd2.tex", pdf = (out_ext == "pdf"), quiet = FALSE,  : 
   pdflatex is not available
You may want to clean up by 'rm -rf /var/folders/sp/hwpvrvw14vj1hqsz16fyj0lh0000gn/T//Rtmpqon3sL/Rd2pdf12e6bc6eac2'
Error: Command failed (1)
Error in running tools::texi2dvi

Everything else looks the same

Garrett

NAMESPACE

devtools should parse the NAMESPACE file and only export those functions that are listed as exported there.

problems with \Sexpr{}?

not sure if you can reproduce this error:

> install_github('scales','hadley')
Installing scales from hadley
Installing scales
* checking for file ‘/tmp/RtmpHi4NOf/hadley-scales-babefa9/DESCRIPTION’ ... OK
* preparing ‘scales’:
* checking DESCRIPTION meta-information ... OK
* installing the package to process help pages
Error: /tmp/RtmpRaOvjY/Rbuild27126899/scales/man/cscale.Rd:14: there is no package called 'munsell'
Execution halted
Error: Command failed (1)

it seems to be caused by \Sexpr{} (I do have munsell installed)

Function in suggested packages not found.

When I load a package I'm developing with load_all(), I get a "could not find function" error for functions from packages that are listed under Suggests in the DESCRIPTION file and loaded with require(). Packages listed as dependencies work fine, though.

devtools package namespace-mangling + S4.

Certain S4-isms, like setClassUnion, cause devtools to freak out when the files containing them are loaded via devtools::load_code.

I set up a small package called FooThat, which you can download and use as an example; it "depends on" another S4 package (kernlab).

R ends up wanting to load an installed package named devel:FooThat when the first line in R/AllS4.R is parsed:

setClassUnion("MaybeKernel", c('kernel', 'NULL'))

By stepping through the devtools::load_code function and "unwinding" the calls made by:

plyr::l_ply(paths, sys.source, envir=env, chdir=TRUE, keep.source=TRUE)

I can trigger this backtrace.

Browse[2]> sys.source(paths[1], envir=env, chdir=T, keep.source=T)
Loading required package: devel:FooThat
Error in .requirePackage(package) :
 unable to find required package "devel:FooThat"
In addition: Warning message:
In library(package, lib.loc = lib.loc, character.only = TRUE, logical.return = TRUE,  :
 there is no package called 'devel:FooThat'
Loading required package: devel:FooThat
Error in .requirePackage(package) :
 unable to find required package "devel:FooThat"
In addition: Warning message:
In library(package, lib.loc = lib.loc, character.only = TRUE, logical.return = TRUE,  :
 there is no package called 'devel:FooThat'
Error in setClassUnion("MaybeKernel", c("kernel", "NULL")) :
 unable to create union class:  could not set members "kernel", "NULL"

The problem goes away when the call to setClassUnion is commented out.

dependency on RCurl

I see the only reason for the dependency on RCurl is to download from https, and according to the feedback I received, *nix users often have to spend a few more minutes figuring out how to install RCurl because of its system dependency libcurl (and we need to install the header files).

I guess this dependency can be removed because:

  1. on Windows, you can setInternet2(TRUE)
  2. wget is widely available under Linux, so you can call system('wget ...') to download the zip ball

I do not know Mac, though.

check() does not respect `document = FALSE`

Expected Behavior

Calling

check(pkg = "package.name", document = FALSE)

should not run document() on the package being checked.

Actual Behavior

The check() function currently ignores the value of document passed to it as an argument. Consequently, the document() function is always called on pkg regardless of the input:

check <- function(pkg = NULL, document = TRUE) {
  pkg <- as.package(pkg)

  document(pkg, clean = TRUE)
  message("Checking ", pkg$package)

In short, any time a user calls check() on a package, that package will be forcibly documented, and document() will always flush the roxygen cache, since it is called with clean = TRUE.
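
A minimal sketch of the requested behaviour (not the actual devtools source) would simply make that call conditional:

check <- function(pkg = NULL, document = TRUE) {
  pkg <- as.package(pkg)
  if (isTRUE(document)) {
    document(pkg, clean = TRUE)
  }
  message("Checking ", pkg$package)
  # ... build and run R CMD check as before ...
}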

devtools::install() fails within RStudio

Hi Hadley,

devtools::install() within RStudio (Windows 64-bit) currently fails.
I suspect it is actually an RStudio bug: RStudio masks "utils::install.packages" with its own install.packages function.
I think the issue can be resolved by calling "utils::install.packages" in install.r (instead of "install.packages").

Best,

Edwin

document() function ignores @rdname tag

The document function will create .Rd files for all package functions from the roxygen tags, ignoring an @rdname tag that may instruct roxygen to fold this documentation into that of another function. Also, there is no option to not produce an .Rd file; for example, the roxygen documentation may sit with the function generic rather than the .default implementation (in the case of a single class). Maybe this can be viewed as an extension of the @rdname issue.

Error: could not find function build

Any ideas why this would happen?

This doesn't work

build("Zcmarked","~")
Error: could not find function "build"
No suitable frames for recover()

but this does; is build not exported?

devtools:::build("Zcmarked","~")
Loading required package: msm
Loading required package: nlme
Loading required package: graphics
Loading required package: stats
Loading required package: plotrix
This is RMark 2.0.6
Built: R 2.13.0; ; 2011-07-01 17:03:44 UTC; windows

feature request: option to byte-compile when installing

Should be a simple matter of adding a "byte_compile" argument to install_github(), which would pass its value along to install(), which in turn uses

install.packages(built_path, repos = NULL, type = "source",INSTALL_opts="--byte-compile")

if byte_compile is TRUE, else

install.packages(built_path, repos = NULL, type = "source")

install with alias name of package

I think it would be useful during package development if devtools supported installation under an alias name for the package.
For example, suppose we install the develop branch of ggplot2 from Hadley's GitHub:

install_github("ggplot2", "hadley", "develop", alias = "ggplot2.had.dev")
library(ggplot2.had.dev)

This is because sometimes I want to use the release version, sometimes the develop version, and sometimes my own develop version.
Currently this is possible only by re-installing the package.
But if an alias name is supported, I can use all of them simply by loading each alias with library():

library(ggplot2) # load release version
# do something
detach("package:ggplot2", unload=TRUE) # unloat it

library(ggplot2.had.dev) # load dev version
# do same thing and compare the results between release and dev version

Probably this feature will be widely used during development.

directory structure replicated after running document()

Hadley,

I have recently updated to the latest version of devtools (via install_github("devtools")). I am now seeing a complete copy of my package directory structure, including my home directory folders, replicated inside of my package directory after running document("pkg"). The files inside of my original package directory are not modified. This appears to have crept into devtools sometime in the past week. To my knowledge, I have not adjusted or changed anything on my system that would cause this.

I noticed you committed a change to 'Run roxygen in package dir so paths more useful'. I'm guessing the issues are related.

example/details:

Mac OS 10.6
R 2.13
roxygen installed via install_github("roxygen")

home directory (aka ~); /Users/user1
~/.Rpackages:

list(
    default = function(x) {
    file.path("~/r/", x, x)
    }
)

package directory/name: this_pkg

full path for 'this_pkg':/Users/user1/r/this_pkg/this_pkg

After running document("this_pkg") from the R console, the following path is created:

/Users/user1/r/this_pkg/this_pkg/Users/user1/r/this_pkg/this_pkg

visual map of all functions in a package

Aim

Facilitate the understanding of complex packages with a visual overview of all the functions present in a package. Inspiration can be drawn from foodweb in the mvbutils package and graphviz (visualising dependencies between functions in graph form).

Technical considerations

  • arranging different pieces together could be automated using two
    kinds of meta-information: @family from roxygen2, and the network of
    dependencies (e.g. using mvbutils::foodweb). A given family of pieces
    could be laid out in a column, while inter-dependencies could be
    indicated by a common colour scheme, for example.
  • there could be special markers (e.g. a small asterisk) to indicate
    functions that are not exported.
  • there should be a function that returns a grob describing a given
    function (with optional color and asterisk arguments). This grob would
    have widthDetails etc. methods so that it can be placed easily on a
    page.
  • the gridSVG package could be used to animate the output
    interactively; for example highlighting dependencies on mouse-over, showing the function description as a tool-tip, etc.

load_code + error

Currently stops loading files halfway, and the cache gets out of sync.

Development dependencies

Can devtools figure out when dependencies are development versions and use the dev version instead of the released version?

Load R scripts from data/ folder with load_all()

For some (not so) strange and peculiar reason, load_all() doesn't load the datasets from the data/ folder if they're saved as .R scripts. I have a nice little function that loads the package and sources the datasets into the package namespace so everything runs smoothly:

l <- function(){
    load_all("<pkgpath>/")
    for (i in dir("<pkgpath>/data", pattern = "\\.R$", full.names = TRUE))
        sys.source(i, envir = as.environment("package:<pkgname>"))
}

I know what you think about loops (that they're awesome), but this works pour moi...

problem setting globals in .onLoad()

I have a package which sets some globals in the .onLoad function, but when I try to load the package with devtools I get an error when referencing the global variables within the .onLoad function itself.

simple example:

.onLoad <- function(lib, pkg, ...) {
  XX <<- "key"
  YY <<- paste(XX, "value",sep="=")
}

Produces the following error in R

$ R
> l(gmutil)
Error in paste(XX, "value", sep = "=") : object 'XX' not found

if I comment out the second line the code loads fine and XX is set in the global environment.

The package itself passes R CMD check with no errors and loads fine when installed...
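
One common workaround (a sketch, not code from the reported package) is to avoid <<- in .onLoad and keep state in a package-local environment instead:

.globals <- new.env(parent = emptyenv())

.onLoad <- function(libname, pkgname) {
  .globals$XX <- "key"
  .globals$YY <- paste(.globals$XX, "value", sep = "=")
}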

Safeguards to avoid accidental nuking of project directories

Say a user ill-advisedly runs dev_mode('~/foo') where ~/foo is the same directory as given by the default function in ~/.Rpackages, having misunderstood the path parameter … and then runs install('bar').

The user's new bar package will be installed from ~/foo/bar/ — into ~/foo/bar. This will replace his project directory … and, significantly, discard subdirectories such as, say, .git. You see where this is going.

If the user hasn't already run git push before this, then he's just lost a bunch of work. Just, you know, theoretically speaking. (I'm not at all in pain, and if I were, I would not admit it.)

To avoid such accidents:

  • dev_mode could check whether its path parameter is the same as the directory given by the default function in ~/.Rpackages, or the parent of any other directories mapped in that file, and raise an error.
  • install could check whether the directory it's about to overwrite contains a VCS subdirectory like .git, .hg, .svn, etc., and raise an error (or maybe preserve such subdirectories).
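
A minimal sketch of the second safeguard (the helper name and the hook into install are hypothetical):

has_vcs_dir <- function(path) {
  any(file.exists(file.path(path, c(".git", ".hg", ".svn"))))
}
# install() could then refuse to proceed, e.g.:
# if (has_vcs_dir(target_dir)) stop("target looks like a VCS checkout; refusing to overwrite")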

Folding roxygen(2) into the install_github() workflow

It would be nice if there were a way to signal to install_github() that some script should be run before installing, or at least that roxygen should be used to create Rd files and NAMESPACE. That way, derived files don't need to be kept in the repository.

check should optionally resave data for compression

R CMD check now throws a warning when data files are not sufficiently compressed. It would be nice to have a devtools command to save the data in the optimally compressed format. Perhaps this could be run as an option in check().
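
Base R already ships a helper that such a devtools command could wrap (a sketch; the devtools-level interface itself is hypothetical):

# re-save every .rda file under data/ with the best available compression
tools::resaveRdaFiles("data", compress = "auto")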

Build needs libpaths set

e.g. from install.packages:

if (length(libpath)) 
  if (.Platform$OS.type == "windows") {
      oldrlibs <- Sys.getenv("R_LIBS")
      Sys.setenv(R_LIBS = libpath)
      on.exit(Sys.setenv(R_LIBS = oldrlibs))
  }
  else cmd0 <- paste(paste("R_LIBS", shQuote(libpath), 
      sep = "="), cmd0)

test() won't work after check() when S4 methods are involved

...and vice versa

For example, after running

check("lubridate")

test produces the following error

test("lubridate")
Loading lubridate
Error: package slot missing from signature for generic ‘[<-’
  and classes Interval, ANY
  cannot use with duplicate class names (the package may need to be re-installed)

It seems test is trying to reload the S4 methods of lubridate, but gets confused because the methods are already loaded from check(). As a result, test isn't sure how to differentiate the methods it wants to load. The same thing happens when I run check() after test().

The error comes from R's behavior: The same thing happens outside of devtools if I try redefining existing S4 methods. I can use removeMethod("[<-,Interval,ANY") to postpone the error message until R encounters the next pre-existing S4 method - but I really need to remove all at once.

Can check() and test() remove S4 methods on exit?

Garrett

Better recreate behaviour of examples in R CMD check

  • tools:::.createExdotR and tools:::massageExamples

  • run_one_arch in tools:::.check_packages:

    status <- R_runR(NULL, c(Ropts, enc), c("LANGUAGE=en", 
        if (nzchar(arch)) env0, jitstr), stdout = exout, 
        stderr = exout, stdin = exfile, arch = arch)
    if (status) {
        errorLog(Log, "Running examples in ", sQuote(exfile), 
          " failed")
        txt <- paste(readLines(exout, warn = FALSE), 
          collapse = "\n")
        chunks <- strsplit(txt, "> ### \\* [^\n]+\n> \n> flush[^\n]+\n> \n", 
          useBytes = TRUE)[[1L]]
        if ((ll <- length(chunks)) >= 2) {
          printLog(Log, "The error most likely occurred in:\n\n")
          printLog0(Log, chunks[ll], "\n")
        }
        else {
          printLog(Log, "The error occurred in:\n\n")
          printLog0(Log, txt, "\n")
        }
        do_exit(1L)
    }

Vectorise install_github

To simplify (e.g.)

pkgs <- list(hadley = c('productplots', 'scales', 'densityvis'),
             ggobi = c('objectSignals', 'plumbr', 'qtpaint', 'cranvas'))
for (repo in names(pkgs)) {
  for (pkg in pkgs[[repo]]) install_github(pkg, repo)
}

find_package should use special case first, then default, if it exists

At the moment, find_package looks for a default value, and then uses it if a file exists in that location. I think this logic should be reversed, i.e. accept the special case if it exists, then look in the default location.

My use case is the following:

  1. I have been working on a package that uses GitHub as the repository. Most of my packages used with devtools sit in a special folder, in this case f:\git\pkg\pkg
  2. But now I want to switch the repository to R-Forge, so I have created a duplicate of my package in a different folder, in this case f:\svn\pkg\pkg
  3. This means that the default value in ~/.Rpackages (pointing to f:\git...) points to an old copy of the package. However, find_package thinks this is valid.

So, I propose the logic is:

  1. Use the special case, if it exists.
  2. Use the default, if it exists.

The following code should effect that change (simple reversal in order between the last two blocks):

find_package <- function(x) {
  desc_path <- file.path(x, "DESCRIPTION")
  if (file.exists(x) && file.exists(desc_path))
    return(x)

  # If .Rpackages exists, use that to find the package locations
  config_path <- path.expand("~/.Rpackages")
  if (file.exists(config_path)) {
    lookup <- source(config_path)$value

    # Try the special case, if it exists
    if (!is.null(lookup[[x]]))
      return(lookup[[x]])

    # Otherwise try the default function, if it exists
    if (!is.null(lookup$default)) {
      default_loc <- lookup$default(x)
      if (file.exists(default_loc))
        return(default_loc)
    }
  }
  NULL
}

Post-release

After package uploaded to CRAN:

  • tag using scm
  • produce announcement email
  • ... ?

Test selected files

Extend test() to take a second argument giving a regular expression to match against test file names.
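
The interface might look something like this hypothetical call (a sketch, not the current signature):

test("mypkg", filter = "date")   # run only test files whose names match "date"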

install_github error

Having successfully used install_github previously, I now get the error

> install_github('test_that')
Installing test_that from hadley
Error in function (type, msg, asError = TRUE)  : 
  Unknown SSL protocol error in connection to nodeload.github.com:443

It is the same for a number of packages

> install_github('lubridate')
Installing test_that from hadley
Error in function (type, msg, asError = TRUE)  : 
  Unknown SSL protocol error in connection to nodeload.github.com:443

save binary build zip file to a specific directory

Is there a way to have install save the binary build to the directory containing the package source directory?

package/
    package.zip
    package/
        R/
        man/
        src/
        ...

Many of my packages are only intended for use by one or a few individuals because they are centered on a specific analysis. I want to be able to easily post the Windows binary on GitHub for collaborators to download rather than releasing it on CRAN. Right now install doesn't create the binary, and if it did, it looks like it would end up in a temp directory that is not easily accessible. What I want is the equivalent for install of

devtools:::build("CIPinnipedAnalysis","C:/users/jlaake/git/CIPinnipedAnalysis")

where it puts the .zip file into the source pkg directory, but I'd prefer not to have to type in the directory since devtools already knows which one it is working with. So something like install(build=TRUE) would be great.

Right now I have to do this with a command window: cd to the directory containing the source package and run R CMD INSTALL CIPinnipedAnalysis --build.

Thanks for considering this. --jeff
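
With build()'s path and binary arguments, something close to this is already expressible (a sketch; argument availability depends on the devtools version):

# write the Windows binary next to the package sources
build("CIPinnipedAnalysis", path = "C:/users/jlaake/git/CIPinnipedAnalysis", binary = TRUE)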

Possible bug in devtools

install_github fails on a Mac (OS X 10.7.2) with an error. Typical example:

 install_github('animation', 'yihui')
Installing animation from yihui
Installing animation
* checking for file '/private/var/folders/j4/vll_yjq95076jc7bt3nq8z_m0000gn/T/Rtmpq2JK8B/yihui-animation-6de5df9/DESCRIPTION' ... OK
* preparing 'animation':
* checking DESCRIPTION meta-information ... OK
* checking for LF line-endings in source and make files
* checking for empty or unneeded directories
* looking to see if a 'data/datalist' file should be added
* building 'animation_2.0-6.tar.gz'

Warning: invalid package '/var/folders/j4/vll_yjq95076jc7bt3nq8z_m0000gn/T//Rtmpq2JK8B/animation_2.0-6.tar.gz'
Error: ERROR: no packages specified
Warning message:
In install.packages(built_path, repos = NULL, type = "source") :
  installation of package '/var/folders/j4/vll_yjq95076jc7bt3nq8z_m0000gn/T//Rtmpq2JK8B/animation_2.0-6.tar.gz' had non-zero exit status

check produces no output

When I run check on my project it produces the following output in my R console:

Updating simar documentation
Writing file1.Rd
Writing file2.Rd
....
Checking simar
Error: Command failed (1)

I would like to see the verbose output produced by R CMD check showing all the warnings and errors. Is this possible?

Use system2

Instead of the current shell/system combo; also add an option for setting environment variables, probably defaulting to LC_COLLATE=C, etc.

source_gist

Feature request: a very simple interface to run some code from a gist.
I can probably implement it if you agree to include this feature in devtools.
I'm not sure this feature is suitable for devtools, though...

something like:

source_gist("1653395")
source_gist("1653395", "rpipe.r")

etc.

devtools wiki conflicts with `R CMD check` recommendation

The wiki page Package-basics says “the inst/demo/ directory contains larger scale demos, that use many features of the package.” But when I put a file there, R CMD check yields a warning:

Found the following non-empty subdirectories of 'inst' also used by R:
inst/demo
It is recommended not to interfere with package subdirectories used by
R.

Looking at ggplot2, plyr, roxygen2 and zoo, none of them have an inst/demo directory. zoo has a demo directory at top level. Is that the way to go?

install_github does not automatically install packages listed under "imports"

Perhaps I have some configuration wrong somewhere, but install_github is not asking me to install packages imported by the target package, instead just throwing an error when it attempts the load step during install:

** preparing package for lazy loading
Error in loadNamespace(i, c(lib.loc, .libPaths())) :
there is no package called 'Hmisc'
 ERROR: lazy loading failed for package ‘pdgControl’
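
If the problem is only that declared dependencies are not pulled in, passing dependencies = TRUE may help (an assumption about the devtools version in use; "user/pdgControl" is a placeholder repository):

install_github("user/pdgControl", dependencies = TRUE)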
