
mappp's Introduction

👋 I am an Associate Professor of Pediatrics at Cincinnati Children’s Hospital Medical Center and the University of Cincinnati. As a biostatistician, epidemiologist, and geospatial data scientist, I specialize in informatics and machine learning with applications to population-level environmental, community, and health outcome data. I develop new methods and technologies to support environmental and population health research, including tools for geocoding and geomarker assessment, high-resolution spatiotemporal exposure assessment models, and causal inference machine learning methods. I lead research on the roles of environmental exposures and community characteristics in pediatric psychiatric health by applying these methods and tools to large databases of electronic health records, observational cohort studies, clinical registries, and vital records.

mappp's People

Contributors

cole-brokamp

Stargazers

11 stargazers

Watchers

3 watchers

Forkers

izahn rnaimehaom

mappp's Issues

Consider using furrr instead

https://github.com/DavisVaughan/furrr provides a parallel map with progress reporting. It is part of the futureverse, meaning it can use a wide variety of parallel back-ends. My personal opinion (which may be worth nothing, of course; you decide) is that it would be better to contribute any missing features to furrr rather than duplicate the effort.
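For reference, a minimal sketch of what a furrr-based parallel map with progress could look like (the backend choice, worker count, and slow_square() example are illustrative, not part of mappp):

```r
library(future)
library(furrr)

plan(multisession, workers = 4)  # any future backend works here

slow_square <- function(x) {
  Sys.sleep(0.5)  # stand-in for real work
  x^2
}

# future_map() mirrors purrr::map() but runs on the chosen backend;
# .progress = TRUE prints a simple progress indicator while it runs.
res <- future_map(1:20, slow_square, .progress = TRUE)
```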

better error capture

When using safely(), produce better error messages that identify which of the list items produced errors; this makes it easier to debug when using map-like functions.
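A rough sketch of the kind of reporting this could produce, using purrr::safely() directly (the risky() function and the message format are illustrations, not mappp's actual interface):

```r
library(purrr)

risky <- function(x) {
  if (x == 3) stop("bad input: ", x)
  sqrt(x)
}

# safely() wraps each call so every element returns list(result, error);
# collecting the error slots tells us exactly which items failed.
out    <- map(1:5, safely(risky))
errs   <- map(out, "error")
failed <- which(!map_lgl(errs, is.null))

if (length(failed) > 0) {
  message("errors in items: ", paste(failed, collapse = ", "))
}
```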

prevent purrr dependency?

Instead of using purrr::as_mapper(), just ensure the input is a list.

Could list coercion be why it is sometimes so memory intensive?
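One possible shape for this, sketched with base R only (prep_input() is a hypothetical helper, not mappp code):

```r
# Hypothetical helper: accept a function (or its name) via base R's
# match.fun() instead of purrr::as_mapper(), and coerce the input to a
# list up front. Note that as.list() copies the data, which could be
# the memory cost raised above.
prep_input <- function(.x, .f) {
  .f <- match.fun(.f)
  if (!is.list(.x)) .x <- as.list(.x)
  list(x = .x, f = .f)
}
```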

Use parallelly for better detection of available cores

I'm using DeGAUSS to geocode both locally and on a university HPC cluster. As is common in such environments, the machines have a large number of CPUs but I am limited to a subset of them. Unfortunately, parallel::detectCores() has no support for this and just returns the number of physical cores. This is bad because DeGAUSS ends up running 132 processes when I only have access to 4 cores!

The https://github.com/HenrikBengtsson/parallelly package offers an availableCores() function[1] that addresses this limitation. Would you be open to a PR swapping out the parallel::detectCores() bit for parallelly::availableCores()?

[1] https://github.com/HenrikBengtsson/parallelly/blob/develop/R/availableCores.R
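A sketch of the proposed swap (the n_cores variable is illustrative):

```r
library(parallelly)

# availableCores() honors scheduler and container limits
# (e.g. SLURM/PBS environment variables, cgroups), whereas
# parallel::detectCores() reports every core on the machine.
n_cores <- availableCores()

# old behavior, for comparison:
# n_cores <- parallel::detectCores()
```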
