bessagroup / f3dasm
Framework for Data-Driven Design & Analysis of Structures & Materials (F3DASM)
Home Page: https://f3dasm.readthedocs.io/
License: BSD 3-Clause "New" or "Revised" License
Decide what to do with the content of the uml folder. Are the files relevant for future development?
# TODO: evaluation is not defined at (x, y) = (0, 0)
continuous = False #TODO: change to True
Hi,
Which optimization algorithm do you use? I would like to know how optimization is carried out in this project. Could you explain it?
Best!
continuous = False #TODO: change to True
Organize and consolidate information, examples and tutorials in the docs folder.
The name of the function might be misleading. I expected the function to keep adding outputs (y1, y2, etc.), but instead it assigns values to the already created output space.
TU Delft has a new software policy for open-source software, and it has a specific way to write the license file. Do we adopt that? or do we keep the current license?
Dear Dr Bessa,
Excuse me, STEP2a_ML_Classification.py and STEP2b_ML_Regression.py are Python files, but the others are MATLAB files. How should they be used together? Does UQlab need to be installed?
Best regards!
# TODO: this function is incompatible with a 1D design,
I feel that some parts of the execution code are structured in a complex way, and the intuition might be lost to the reader. Referring to an example of multiple realizations of SGA from the tutorial practical_session_solutions.ipynb:
# Define the blocks:
dimensionality = 20
iterations = 600
realizations = 10
seed = 42  # any fixed seed; the original snippet used `seed` without defining it
hyperparameters = {}  # If none are selected, the default ones are used
domain = np.tile([-1., 1.], (dimensionality, 1))
design = f3dasm.make_nd_continuous_design(bounds=domain, dimensionality=dimensionality)
data = f3dasm.Data(design)
# We can put them in a dictionary if we want
implementation = {
    'realizations': realizations,
    'optimizer': f3dasm.optimization.CMAES(data=data, hyperparameters=hyperparameters),
    'function': f3dasm.functions.Ackley(dimensionality=dimensionality, noise=False, scale_bounds=domain),
    'sampler': f3dasm.sampling.LatinHypercubeSampling(design, seed=seed),
    'iterations': iterations,
}
results = f3dasm.run_multiple_realizations(**implementation)
It would be helpful if there weren't so many interdependent arguments (domain, design, and data, which take one another as arguments for instantiation) that are then used again in the implementation dictionary.
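One way to reduce the interdependent arguments would be a single configuration object that derives domain, design, and data internally from scalar settings. A minimal sketch of this idea (the class name and API are hypothetical, not part of f3dasm):

```python
from dataclasses import dataclass, field

@dataclass
class RealizationConfig:
    """Hypothetical configuration object: the user supplies only scalar
    settings; interdependent objects (domain, design, data) would be
    built internally instead of being passed around by hand."""
    dimensionality: int = 20
    iterations: int = 600
    realizations: int = 10
    bounds: tuple = (-1.0, 1.0)
    hyperparameters: dict = field(default_factory=dict)

    def domain(self):
        # Tile the same bounds for every dimension, as in the tutorial.
        return [list(self.bounds) for _ in range(self.dimensionality)]

cfg = RealizationConfig(dimensionality=5)
# cfg.domain() gives five copies of [-1.0, 1.0], one per dimension
```

A `run_multiple_realizations(cfg)` entry point could then construct the design, data, sampler, and optimizer from this one object.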
Here are my comments regarding the first F3DASM tutorial session hosted by Martin.
Implement a new version of the DoE component based on the agreements on the 25-Oct.
Materials can be represented in several ways. Which representation makes more sense for the cases in F3DASM?
The representation should be:
Here is one possibility:
mat1 = {'elements': [ {'name': 'STEEL', 'params': {'param1': 1, 'param2': 2}},
{'name': 'CARBON', 'params': {'param1': 3, 'param2': 4, 'param3': 'value3'} }
]
}
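Whichever representation is chosen, it should be easy to flatten into DoE columns. A small sketch of flattening the nested dict above into 'ELEMENT.param' keys (the flattening convention is only illustrative):

```python
mat1 = {'elements': [{'name': 'STEEL', 'params': {'param1': 1, 'param2': 2}},
                     {'name': 'CARBON', 'params': {'param1': 3, 'param2': 4, 'param3': 'value3'}}]}

def flatten_material(mat):
    """Flatten the nested material dict into 'ELEMENT.param' -> value pairs,
    which map directly onto columns of a design-of-experiments table."""
    flat = {}
    for element in mat['elements']:
        for key, value in element['params'].items():
            flat[f"{element['name']}.{key}"] = value
    return flat

flat = flatten_material(mat1)
# flat has one entry per (element, parameter) pair, e.g. 'STEEL.param1'
```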
# TODO: add possibility to insert ONLY input numpy array (leave output array as NaNs)
continuous = False #TODO: change to True
When passing non-numeric values to the DATA class, the method torch_sensor() produces the following error:
return torch.tensor(self.values).float()
TypeError: can't convert np.ndarray of type numpy.object_.
The only supported types are: float64, float32, float16, complex64,
complex128, int64, int32, int16, int8, uint8, and bool.
We need a fix for this. For now, we remove the line of code that triggers the error.
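Instead of deleting the call, a possible fix is to guard the conversion: keep only the columns that can be cast to float before building the tensor. A sketch with numpy (the helper name is hypothetical; the torch call is shown only in a comment):

```python
import numpy as np

def numeric_only(values: np.ndarray) -> np.ndarray:
    """Drop columns that cannot be cast to float, so the remaining
    array is safe to hand to torch.tensor(...).float()."""
    keep = []
    for j in range(values.shape[1]):
        try:
            values[:, j].astype(np.float64)
            keep.append(j)
        except (TypeError, ValueError):
            pass  # non-numeric column: skip it
    return values[:, keep].astype(np.float64)

mixed = np.array([[1.0, 'a', 2.0],
                  [3.0, 'b', 4.0]], dtype=object)
clean = numeric_only(mixed)
# clean is float64 with shape (2, 2); torch.tensor(clean).float() would now succeed
```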
I noticed some undesired behaviour regarding the Sobol sampling. Basically, if we look at Example 1 in the pull request mentioned above, 'Fs' is sampled with Sobol in 3 dimensions (F11, F12, F22), but in the final DoE these dimensions become squeezed into a single one, F1s (I believe that happens in 'create_combinations' in doevars.py, line 102). As a result the number of DoE points goes from the desired 5 to 5 * #dim = 15 in this case, and the overall number of DoE points is 270 but should be 90. In other words, I think F11, F12, F22 should each appear in a separate column instead of being squeezed together. Maybe multi-indexing from pandas could be a solution.
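The expected count can be checked independently of the sampler: the 5 Sobol points in (F11, F12, F22) should enter the grid as 5 rows with three columns, not as 15 separate values. A minimal sketch of the combination count (the sample values and the 18 remaining cases are illustrative numbers matching the example):

```python
from itertools import product

# 5 Sobol-like samples of the 3-component Fs = (F11, F12, F22): each sample
# is one row with three columns, not three independent values.
fs_samples = [(0.1 * i, 0.2 * i, 0.3 * i) for i in range(5)]

# Suppose the remaining DoE variables combine into 18 cases; the full DoE
# is the product of *rows*, not of individual values.
other_cases = range(18)

doe = list(product(fs_samples, other_cases))
# len(doe) is 5 * 18 = 90 points, as expected.
# Squeezing F11, F12, F22 into one column would instead give 15 * 18 = 270.
```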
A requirements file is missing for this implementation. It must be added before the new release.
Extend the code in the simulator to implement examples using Abaqus. Use Luis's code as a starting point.
continuous = False #TODO: change to True
# TODO: change noise calculation to work with autograd.numpy @mpvanderschelling
Input scaling doesn't work with dimensionality 3 or higher.
continuous = False #TODO: change to True
The case Gawel is currently working on requires defining problems and geometries differently from the supercompressible and hyperelastic cases.
We should discuss and agree on how to handle such cases and what changes to DoeVars.py are required.
Overall, I think comments 2 and 3 are more urgent, because they make the package less friendly to the user.
x0 = f3dasm.ContinuousParameter(name='x0', lower_bound=-1.0, upper_bound=1.0)
x1 = f3dasm.ContinuousParameter(name='x1', lower_bound=-1.0, upper_bound=1.0)
Accessing samples by
rsampler = f3dasm.sampling.RandomUniformSampling(design, seed)
samples_random = rsampler.get_samples(30)
causes the following error:
"KeyError: "Only a column name can be used for the key in a dtype mappings argument. '('input', 'x0')' not found in columns."
It would be nice if the error is more helpful!
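One low-cost improvement would be to catch the KeyError at the sampling boundary and re-raise it with a domain-level message. A sketch (the wrapper function is a hypothetical helper, not the real f3dasm API):

```python
def get_samples_checked(sampler, n):
    """Wrap sampling so a low-level pandas KeyError surfaces as a
    message the user can act on (hypothetical helper)."""
    try:
        return sampler.get_samples(n)
    except KeyError as exc:
        raise KeyError(
            f"Sampling failed: a design column referenced in the dtype "
            f"mapping was not found ({exc}). Check that every parameter "
            f"in the design space has a matching input column."
        ) from exc
```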
# TODO: this function is incompatible with a 1D design,
Implement declaration of the sampling method as part of each DoE parameter.
Example:
vars = { salib_sobol({'f11': [1,2] ....., n=10}),
         material: {}
}
The above syntax is not possible because, in Python, a dictionary must define key:value pairs. This means the example above has to be translated into something like: vars = {'param1': salib_sobol({'f11': [1,2]...., n=10}), 'material': {}}. In other words, we would have to define a key for every parameter that uses a sampling method. This makes the whole dictionary difficult to parse, and it becomes hard to distinguish which keys are names for the columns of the data frame and which are not.
Below, I propose some options to deal with this issue with the least impact on the current code base. Any suggestions or ideas for a better way to handle it?
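One option with little impact on the current code base is to keep the dictionary keys as column names and wrap the sampler choice in a small value object, so parsing only has to check the value's type. A sketch (the class and field names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class SamplerSpec:
    """Marks a DoE value as 'to be sampled' rather than a literal."""
    method: str   # e.g. 'sobol'
    ranges: dict  # e.g. {'f11': [1, 2], ...}
    n: int = 10

doe_vars = {
    'Fs': SamplerSpec('sobol', {'f11': [1, 2], 'f12': [0, 1]}, n=10),
    'material': {'name': 'STEEL'},  # literal values stay plain dicts
}

# Parsing is now unambiguous: every key is a column name, and the value's
# type says whether it needs sampling.
to_sample = {k: v for k, v in doe_vars.items() if isinstance(v, SamplerSpec)}
```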
It is not clear if the function calculates the point closest to the global minima or the one that is closest in value to the global minima.
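The two readings can give different answers, so the docstring should say which one is meant. A minimal numpy sketch of the distinction (the points and values are made up for illustration):

```python
import numpy as np

x_global, f_global = 0.0, 0.0  # known global minimum and its value
xs = np.array([0.1, 2.0])      # evaluated points
fs = np.array([5.0, 0.2])      # their function values (illustrative)

# Reading 1: the point whose *location* is closest to the global minimum.
closest_in_x = xs[np.argmin(np.abs(xs - x_global))]  # picks x = 0.1

# Reading 2: the point whose *value* is closest to the global minimum value.
closest_in_f = xs[np.argmin(np.abs(fs - f_global))]  # picks x = 2.0
```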
Implement a new and more modular structure for the package based on the discussion and the group's agreements on 25-October
Excuse me, have you encountered compatibility problems?
You seem to use both Abaqus Python scripts and Python 3 libraries.
Abaqus currently uses Python 2, but most libraries are Python 3. How do you integrate the Abaqus Python scripts with Python 3 libraries? They are often incompatible.
Best!
Write a new readme file for the new release
def evaluate(self, X): #TODO: correct formula should be np.sum((X - 1) ** 2) - np.sum(X[1:] * X[:-1]). Remove unused d and i variables.
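With the corrected formula from the TODO (this is the Trid function), evaluate can be written without the unused d and i variables. A sketch as a standalone function:

```python
import numpy as np

def trid(X: np.ndarray) -> float:
    """Trid function, using the corrected formula from the TODO:
    sum((x_i - 1)^2) - sum(x_i * x_{i-1})."""
    X = np.asarray(X, dtype=float)
    return float(np.sum((X - 1) ** 2) - np.sum(X[1:] * X[:-1]))

# Known property of Trid: in d dimensions the global minimum is at
# x_i = i * (d + 1 - i), with value -d * (d + 4) * (d - 1) / 6.
# For d = 2 that is x = (2, 2) with trid(x) = -2.
```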
Integrate the commit by Taylan by creating a pull request to a patch branch and merging it into the new-structure branch.
continuous = False #TODO: change to True
Personally, I delve into a documentation center when learning a new software. The current documentation center appears to have all the nuts and bolts (I like this), but the intuitive navigation of the code is not there for me. I want to suggest the following for improved readability:
It would be nice to know what all packages will be installed (when installing F3DASM) and how much space it would take up.
## TODO: why is this check necessary?
Thank you for the session. It was interactive, helpful and insightful. Here are some thoughts of things I would do again, and items I would change:
I have faced errors during the installation of f3dasm, which made me dig into the installation in more detail. The following suggestions won't necessarily fix the issue I am facing, but they would definitely be a clean-up and would also make installation more straightforward for others.
I see that there are many (slightly more than usual) packages in the f3dasm_environment.yml file, and probably these are packages that got installed as additional packages. If this is the case, I recommend that the yaml file is exported as:
conda env export -n f3dasm_env --from-history -f f3dasm_environment.yml
Of course, this wouldn't list the pip installations. For those, do a regular export as:
conda env export -n f3dasm_env -f f3dasm_environment_pip.yml
Then manually update f3dasm_environment.yml to include the pip installation list from f3dasm_environment_pip.yml.
Before running pip install -e ., the environment has to be activated. So:
conda activate f3dasm_env
pip install -e .
Importing f3dasm fails because setup.py doesn't install tensorflow as a dependency. I also see that some packages in setup.py are preferably installed via conda. It would be a good idea to separate the pip installations from the conda installations.
Furthermore, it might be worth pinning the package versions in setup.py, unless that is left out on purpose.
If the "preferred way" and "if things fail" sections are updated as suggested above, the "installing all packages manually" section would become redundant. For "if things fail", I suggest that the conda installations happen first, followed by installation through setup.py, which would then only install the additionally required packages (i.e. with the redundancies removed from setup.py).
This import statement is commented out.
Extend the code in the simulator to implement a FEniCS example, using Taylan's code as a starting point.
I feel it would be good practice to freeze the version at the stage of a tutorial, so that the Jupyter notebooks can be reproduced in exactly the same way as during the tutorial. I can see from the release notes that this was the intention, i.e. v1.0.0 was supposed to be frozen for the tutorial. However, some commits have slipped through. For example, I have figured out that commit bfd8395, made to v1.0.0, refactored the scaling of functions. This influences the results of the tutorial, causing a mismatch between the tutorial and the notebook practical_session_solutions.ipynb.
Suggestion: if version v1.0.0 had been frozen at the stage of the tutorial and all subsequent changes had gone into, for example, v1.0.1, this issue could have been avoided.