jeffrey-hokanson / psdr
Parameter space dimension reduction toolbox
License: GNU Affero General Public License v3.0
A paper by den Hertog and Stehouwer describes four different problems arising from a color CRT television design problem. These problems have 7, 23, 5, and 14 parameters respectively, and are thus good candidates for parameter space dimension reduction. The code was expensive to run in 2002 (several hours).
Following Austin et al., implement a multivariate Arnoldi algorithm in the style of Brubeck, Nakatsukasa, and Trefethen. This should correct the conditioning issues inherent in constructing multivariate polynomial bases; in particular, it would improve the conditioning of PolynomialRidgeApproximation, potentially allowing higher-order accuracy.
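As a one-variable illustration of the idea (the multivariate version cycles the shift over coordinates), a minimal Vandermonde-with-Arnoldi sketch might look like the following; all names are illustrative, not the library's internals:

```python
import numpy as np

def vandermonde_arnoldi(x, degree):
    """Build a discretely orthonormal polynomial basis on the sample
    points x via the Vandermonde-with-Arnoldi recurrence (one variable).
    Unlike a raw Vandermonde matrix, the result is well conditioned."""
    n = len(x)
    Q = np.zeros((n, degree + 1))
    H = np.zeros((degree + 1, degree))
    Q[:, 0] = 1.0 / np.sqrt(n)
    for k in range(degree):
        q = x * Q[:, k]                  # multiply by x (the shift)
        for j in range(k + 1):           # modified Gram-Schmidt
            H[j, k] = Q[:, j] @ q
            q = q - H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(q)
        Q[:, k + 1] = q / H[k + 1, k]
    return Q, H
```

By construction the columns of Q are orthonormal in the discrete inner product, so its condition number stays near one even at degrees where a monomial Vandermonde matrix is numerically singular.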
Resulting fit looks incorrect. See /PSDR/docs/source/ASAP.ipynb
pra = psdr.PolynomialRidgeApproximation(degree=1, subspace_dimension=1)
For the minimax design algorithm based on clustering [MJ18], we could replace the approximate max-norm center computed using the q-norm, with an exact max-center using CVXPY. To improve performance when there are a large number of points, we could incorporate Blitz [JG15] to estimate a working set and use Ritter's Bounding Sphere algorithm to initialize the feasible point.
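The issue proposes an exact solve via CVXPY; as a dependency-free sketch of the same quantity, the Bădoiu–Clarkson core-set iteration below approximates the minimum-enclosing-ball (max-norm) center of a point set. The function name is illustrative, and this is an approximation scheme, not the exact CVXPY solve:

```python
import numpy as np

def max_norm_center(points, iters=1000):
    """Approximate the point minimizing the maximum Euclidean distance
    to the given points (center of the minimum enclosing ball), using
    the Badoiu-Clarkson iteration: repeatedly step toward the farthest
    point with a shrinking step size 1/(k+1)."""
    c = points[0].astype(float)
    for k in range(1, iters + 1):
        farthest = points[np.argmax(np.linalg.norm(points - c, axis=1))]
        c += (farthest - c) / (k + 1)
    return c
```

Per the issue, an exact center from CVXPY could replace this, with Ritter's bounding sphere supplying the initial feasible point.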
Modify the NACA0012 test problem to include this parameterization. I'd like to use piecewise linear boxes and encode a negative curvature constraint by requiring that, for each set of three adjacent points, the finite-difference approximation of the second derivative is negative for the upper surface and positive for the lower surface.
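A sketch of how these curvature constraints could be encoded as linear inequalities; the function name is illustrative and uniform point spacing is assumed:

```python
import numpy as np

def curvature_constraints(n):
    """Second-difference matrix D such that (D @ y)[i] approximates the
    second derivative at interior node i+1 (uniform spacing assumed).
    Each row applies the stencil [1, -2, 1] to three adjacent points."""
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    return D

# Negative curvature on the upper surface:  D @ y_upper <= 0
# Positive curvature on the lower surface: -D @ y_lower <= 0
```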
An alternating algorithm for constructing a ridge approximation using a GP ridge profile.
See paper: https://arxiv.org/abs/1802.00515
The function initial_sample tries to provide points sampled uniformly in the space defined by the metric L. In an earlier implementation this was done by drawing samples uniformly at random from convex combinations. Unfortunately, numerical experiments suggested this performed worse than simple random sampling. However, the fix I have adopted (sampling on the convex hull of these low-dimensional points) scales exponentially in dimension; hence, in that case the heuristic instead simply chooses points at random.
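A small experiment illustrates why convex-combination sampling can underperform: with flat Dirichlet weights, every weight hovers near 1/m, so samples cluster around the centroid rather than filling the hull. The function name is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def convex_combination_sample(vertices, n_samples):
    """Sample points as convex combinations of the given vertices, with
    weights drawn uniformly from the simplex (flat Dirichlet).
    As the number of vertices grows, the weights concentrate near 1/m,
    pulling the samples toward the centroid."""
    m = len(vertices)
    w = rng.dirichlet(np.ones(m), size=n_samples)   # (n_samples, m)
    return w @ vertices
```

On the unit square (four vertices) the per-coordinate standard deviation already drops below that of a uniform sample (about 0.224 versus 0.289), and the effect worsens in higher dimension.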
Several of the tutorial notebooks use an older style of handling functions and constraints. Update these examples to reference the more modern approach (e.g., fun.domain) in which normalization is handled transparently.
A test problem with 6 variables based on an actuator for a dot-matrix printer. See Sec. 6 of: Michael Stein, "Large Sample Properties of Simulations Using Latin Hypercube Sampling," Technometrics, Vol. 29, No. 2 (May 1987), pp. 143-151.
Hi @jeffrey-hokanson, I found this library last week and I think it's great. I would like to compare the polynomial approximation to machine learning approaches like neural networks. However, I got an error when trying to run the library with 50,000 samples: the SVD in PolynomialRidgeApproximation._finish requires too much memory, and I was wondering if it is really necessary. Could it be replaced with linalg.lstsq? I don't fully understand what is going on there, but from the description it's trying to find the coefficients c that match the optimal U, isn't it?
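Assuming the step in question solves a linear least-squares problem for the coefficients c given the fixed subspace U (the names below are illustrative, not the library's internals), a memory-light alternative could use numpy's lstsq, which avoids forming the full singular-vector matrices of an explicit SVD:

```python
import numpy as np

def fit_coefficients(X, fX, U, degree):
    """Given a one-dimensional ridge subspace U, fit polynomial
    coefficients c by least squares on the projected coordinates
    y = X @ U. (Illustrative stand-in for an SVD-based solve.)"""
    y = (X @ U).ravel()                       # ridge profile coordinate
    V = np.vander(y, degree + 1)              # Vandermonde basis in y
    c, *_ = np.linalg.lstsq(V, fX, rcond=None)
    return c
```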
The authors of this paper on parameter space dimension reduction identify an active subspace in propeller blade design. Two pieces of code are used:
Generating propeller blade profile: https://github.com/mathLab/BladeX
Flow solution: PROCAL (cannot find source code)
The performance of OuterProductGradient depends heavily on the accuracy of local_linear, which in turn depends on the choice of bandwidth for the kernel that weights the other samples by their distance in the ambient space. The perplexity-based method works better than the static bandwidth recommended by [Li18], but there may be better approaches still. There is a wide literature on bandwidth selection techniques; see, e.g., the kedd documentation (an R package) for a brief summary of many. Unfortunately, most of these are quadratic in the number of samples and do not suggest a per-sample bandwidth.
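One plausible per-sample approach, sketched here with illustrative names, is the t-SNE-style bisection that selects a Gaussian bandwidth so the entropy of the kernel weights matches a target perplexity:

```python
import numpy as np

def perplexity_bandwidth(dists, target_perplexity, tol=1e-5, iters=50):
    """Geometric bisection for a Gaussian bandwidth sigma such that the
    entropy of the normalized kernel weights over the other samples
    equals log2(target_perplexity), as in t-SNE."""
    lo, hi = 1e-10, 1e10
    for _ in range(iters):
        sigma = np.sqrt(lo * hi)                # geometric midpoint
        w = np.exp(-dists**2 / (2 * sigma**2))
        w = w / w.sum()
        H = -np.sum(w * np.log2(w + 1e-300))    # entropy in bits
        if H > np.log2(target_perplexity):
            hi = sigma                          # too flat: shrink sigma
        else:
            lo = sigma                          # too peaked: grow sigma
        if abs(H - np.log2(target_perplexity)) < tol:
            break
    return sigma
```

Run once per sample against its distance vector, this gives a per-sample bandwidth in O(n) per sample, avoiding the quadratic cost of most classical selectors.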
See documentation from scipy: https://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.HalfspaceIntersection.html#scipy.spatial.HalfspaceIntersection
Idea: The structure of LinIneqDomain yields a series of half-planes that define the space. The equidistant constraints defining a Voronoi diagram can be viewed as half-planes in the same way, so by enumerating over the points x we can construct the bounded Voronoi diagram. This is likely not feasible in more than a few dimensions.
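A sketch of this idea using scipy's HalfspaceIntersection, with the bisector and box constraints written in scipy's `normal @ x + offset <= 0` convention; the function name is illustrative:

```python
import numpy as np
from scipy.spatial import HalfspaceIntersection

def bounded_voronoi_cell(points, i, lb, ub):
    """Vertices of the Voronoi cell of points[i], clipped to the box
    [lb, ub], computed as a halfspace intersection."""
    p = points[i]
    halfspaces = []
    for j, q in enumerate(points):
        if j == i:
            continue
        # Bisector: x closer to p than q iff (q-p) @ x - (q@q - p@p)/2 <= 0
        a = q - p
        b = -(q @ q - p @ p) / 2.0
        halfspaces.append(np.hstack([a, b]))
    d = len(p)
    for k in range(d):                              # box constraints
        e = np.zeros(d)
        e[k] = 1.0
        halfspaces.append(np.hstack([e, -ub[k]]))   #  x_k <= ub_k
        halfspaces.append(np.hstack([-e, lb[k]]))   # -x_k <= -lb_k
    # points[i] is strictly inside its own cell, so it serves as the
    # interior point HalfspaceIntersection requires
    hs = HalfspaceIntersection(np.array(halfspaces), p.astype(float))
    return hs.intersections
```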
See /PSDR/docs/source/ASAP.ipynb
dom2 = dom.add_constraints(A_eq=pra.U.T, b_eq=[0])
dom2.sample(1)
This is an example appearing in Active Subspaces for Shape Optimization. It is unclear how the authors set up the free-form deformation (FFD) box for this problem; however, there are similar examples in the SU2 documentation, and the case also appears in the optimization test cases, where they seem to have used only one design parameter.
Note this is a 3D fluids problem and as parameterized has 198 parameters.
See https://onlinelibrary.wiley.com/doi/pdf/10.1002/sam.11355 eq. 39, 40 and table 2.
UQLab is a Matlab-based toolbox for uncertainty quantification developed at ETH Zurich. Of particular interest are comparisons of its reliability-based design optimization (RBDO) methods against those based on the ridge approximations implemented in this library.
When printing information about a domain, the code currently renders it in ASCII; e.g.,
>> print(dom)
<LinIneqDomain on R^8; 1 linear equality constraints>
However, it would be nice if we could use Unicode to render a nicer text description; e.g.,
>> print(dom)
<LinIneqDomain on ℝ⁸; 1 linear equality constraints>
Pros:
Cons:
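A minimal sketch of the proposed rendering, using a superscript translation table; the function name is illustrative:

```python
# Map ordinary digits to their Unicode superscript forms
SUPERSCRIPTS = str.maketrans("0123456789", "⁰¹²³⁴⁵⁶⁷⁸⁹")

def unicode_space(dim):
    """Render R^dim with Unicode, e.g. 8 -> 'ℝ⁸'."""
    return "ℝ" + str(dim).translate(SUPERSCRIPTS)
```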
See /PSDR/docs/source/ASAP.ipynb
pra = psdr.PolynomialRidgeApproximation(degree=4, subspace_dimension=1, bound='upper')
In the current design of ConvexHullDomain, we cannot add constraints of the form:
This should be simple to support by extending the constraints when solving the LPs for each of the main problems.
Write a short (250-1000 word) paper.md and place it in the repository.
See instructions at: https://joss.readthedocs.io/en/latest/submitting.html
See paper [HA14]: Geometric Programming for Aircraft Design Optimization by Warren Hoburg and Pieter Abbeel, AIAA Journal, 2014.
Although all their examples are geometric programs that become convex after a change of variables, they are useful test problems when treated as non-convex programs.
See paper on this topic PIAL10.
See JUQ paper: "Modified Active Subspaces Using the Average of Gradients".
The corners of the domain seem to provide useful points to sample, and it would be helpful to enumerate all of them. The cdd library has Python bindings (pycddlib) that could be used for this: https://github.com/mcmtroffaes/pycddlib
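For small problems, a brute-force stand-in for cdd enumerates the vertices of {x : Ax <= b} by solving each n-subset of constraints as equalities and keeping the feasible solutions; the function name is illustrative and the cost is exponential in the number of constraints:

```python
import itertools
import numpy as np

def enumerate_vertices(A, b, tol=1e-9):
    """Enumerate vertices of the polytope {x : A x <= b} by intersecting
    every full-rank n-subset of constraint hyperplanes and keeping the
    feasible intersection points. Brute force; use cdd for real work."""
    m, n = A.shape
    vertices = []
    for idx in itertools.combinations(range(m), n):
        Ai = A[list(idx)]
        if np.linalg.matrix_rank(Ai) < n:
            continue                       # parallel/degenerate subset
        x = np.linalg.solve(Ai, b[list(idx)])
        if np.all(A @ x <= b + tol):
            vertices.append(x)
    return np.unique(np.round(vertices, 9), axis=0)
```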
See discussion of setup in RK20 Sec. 3.4.
Implement the Elliptic PDE example described in https://epubs.siam.org/doi/10.1137/130916138. Old implementation in VarProRidge uses numpad.
Implement the algorithm proposed in Gradient free active subspace construction using Morris screening elementary effects by Lewis, Smith, and Williams.
This will likely require implementing more generic screening methods appropriate for coordinate-based dimension reduction.
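A simplified sketch of the elementary-effects computation (a one-at-a-time radial variant rather than full Morris trajectories; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def morris_effects(f, dim, n_traj=20, delta=0.1):
    """Morris-style screening on [0,1]^dim: for each of n_traj random
    base points, compute the elementary effect of each coordinate, then
    return the mean absolute effect mu* and its spread sigma."""
    effects = np.zeros((n_traj, dim))
    for t in range(n_traj):
        x = rng.uniform(0, 1 - delta, size=dim)
        fx = f(x)
        for i in range(dim):
            xp = x.copy()
            xp[i] += delta                         # one-at-a-time step
            effects[t, i] = (f(xp) - fx) / delta
    mu_star = np.abs(effects).mean(axis=0)
    sigma = effects.std(axis=0)
    return mu_star, sigma
```

Coordinates with small mu* and sigma are candidates for elimination, which is the coordinate-based dimension reduction the issue refers to.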
The current MULTI-F Docker image is larger than 5 GB. This makes testing slow, since travis-ci re-downloads the image every time, and will be a barrier to other users. The goal should be to reduce the size of this image.
In sobol_seq, the inverse survival function is used to map the [0,1] Sobol sequence to a domain specified by a multivariate normal; generalize this to the random domains allowed by our library.
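For the multivariate normal case, a sketch using scipy's qmc module (the generalization to arbitrary random domains is the open part of the issue; the function name is illustrative):

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

def sobol_normal(mean, cov, n):
    """Map a [0,1]^d Sobol sequence to N(mean, cov) via the inverse
    survival function and a Cholesky factor of the covariance."""
    d = len(mean)
    u = qmc.Sobol(d=d, scramble=True, seed=0).random(n)
    u = np.clip(u, 1e-12, 1 - 1e-12)   # guard against endpoint values
    z = stats.norm.isf(u)              # standard normal scores
    L = np.linalg.cholesky(cov)
    return mean + z @ L.T
```

Using a power-of-two n preserves the balance properties of the Sobol sequence.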
The GP class should also allow the user to compute the gradient.
See discussion in section 4.1 of https://doi.org/10.1002/sam.11355
The current quadrature rule is based on tensor product Gauss rules or a simple Monte-Carlo rule. It would be nice to have better rules, perhaps invoking the Lipschitz matrix to weight directions appropriately.
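For reference, a tensor-product Gauss rule can be sketched as follows, which also shows why its cost blows up with dimension; the function name is illustrative:

```python
import numpy as np
from itertools import product

def tensor_gauss(f, degree, dim):
    """Tensor-product Gauss-Legendre quadrature on [-1,1]^dim: exact for
    polynomials up to degree 2*degree-1 in each coordinate, but the node
    count grows as degree**dim (the curse of dimensionality)."""
    x, w = np.polynomial.legendre.leggauss(degree)
    total = 0.0
    for idx in product(range(degree), repeat=dim):
        node = x[list(idx)]
        weight = np.prod(w[list(idx)])
        total += weight * f(node)
    return total
```

A Lipschitz-matrix-weighted rule would aim to spend this budget anisotropically, placing more nodes along directions where the function varies most.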