
bnt's Introduction

Bayes Net Toolbox for Matlab

Written by Kevin Murphy, 1997--2014. Last updated: 28 June 2014. As of June 2014, this is maintained on Github at https://github.com/bayesnet/bnt

bnt's People

Contributors

alproc2, engineero, flodenk, ido, jtylka, murphyk, sachag678, wangxin1205, yogevkr


bnt's Issues

training Gaussian Mixture Model using multiple sequences

From [email protected] on January 08, 2012 02:51:59

In the demo given (http://www.media.mit.edu/wearables/mithril/BNT/mixtureBNT.txt), there is only one sequence for training and one for testing. What if we want to train with multiple sequences? Should we append the intra-class sequences into a single sequence, or is there another procedure? If the action is non-periodic (like standing up), concatenating them into one sequence will definitely cause problems.
I would appreciate your help on this.
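A hedged sketch of one way to handle this in BNT (not from the demo itself): for a static mixture model, every frame is an i.i.d. training case, so several sequences can simply be pooled column-wise into the cases cell array passed to learn_params_em. seq1, seq2, and bnet below are hypothetical placeholders standing in for the demo's data and its 2-node mixture network (hidden class node, Gaussian observation node). For a non-periodic action you would more likely want a temporal model (an HMM/DBN) rather than concatenating sequences.

% Hedged sketch: pool several training sequences as independent cases.
% Assumes a 2-node mixture net (node 1 = hidden class, node 2 = observation).
seqs = {seq1, seq2};                   % each sequence is dim x T_k (placeholders)
allFrames = cat(2, seqs{:});           % frames are i.i.d. under a static mixture
ncases = size(allFrames, 2);
cases = cell(2, ncases);               % rows = nodes, columns = training cases
cases(2, :) = num2cell(allFrames, 1);  % node 2 observed, node 1 left hidden
engine = jtree_inf_engine(bnet);
[bnet2, LL] = learn_params_em(engine, cases, 10);   % at most 10 EM iterations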

Original issue: http://code.google.com/p/bnt/issues/detail?id=23

Seemingly missing functions from the source code

From [email protected] on July 08, 2010 16:43:27

What steps will reproduce the problem?
1. Running mk_named_CPT results in an error because the function 'stringmatch' is missing.
2. Using Gibbs sampling inference gives an error because the function 'compute_posterior' is missing.

What version of the product are you using? On what operating system? 1.07, on Windows XP (SP3).

Additional information: replacing the reference to 'stringmatch' with 'strmatch' and including the 'exact' option appears to fix the first problem.
The second problem appears to be solved by running 'installC_BNT' after changing the directory in the file and making the pathnames consistent with Windows, i.e. replacing / with \. (This contradicts the statement that this method is no longer needed.)
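The first workaround described above, as a minimal sketch (the variable names are illustrative, not BNT's own):

% Replace the missing 'stringmatch' call in mk_named_CPT with strmatch + 'exact':
idx = strmatch(name, allNames, 'exact');   % index of the exact match, [] if none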

Original issue: http://code.google.com/p/bnt/issues/detail?id=5

Variable workspacefunc exception on MATLAB 2015a, Ubuntu

When running the mixtureBNT.m tutorial, an exception appears:

[screenshot: error dialog]

This dialog blocks the editor, so it is very annoying to close it every time the script is run. It also appears when trying to view variable contents in the workspace.

>> mixtureBNT

ans =

   570    31


ans =

   120    31

EM iteration 1, ll = -20253.8215
EM iteration 2, ll = 6959.8467
EM iteration 3, ll = 6959.8470
EM iteration 4, ll = 6959.8479
EM iteration 5, ll = 6959.8514
EM iteration 6, ll = 6959.8654
EM iteration 7, ll = 6959.9158
EM iteration 8, ll = 6960.0552
EM iteration 9, ll = 6960.2468
EM iteration 10, ll = 6960.2868

Current plot held
Current plot held
Exception in thread "AWT-EventQueue-0" java.lang.ClassCastException: [D cannot be cast to [Z
    at com.mathworks.mlwidgets.array.ValuePanel$ROML.matlabEvent(ValuePanel.java:251)
    at com.mathworks.jmi.MatlabMCR$AWTReplyEvent.run(MatlabMCR.java:1636)
    at java.awt.event.InvocationEvent.dispatch(Unknown Source)
    at java.awt.EventQueue.dispatchEventImpl(Unknown Source)
    at java.awt.EventQueue.access$200(Unknown Source)
    at java.awt.EventQueue$3.run(Unknown Source)
    at java.awt.EventQueue$3.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.security.ProtectionDomain$1.doIntersectionPrivilege(Unknown Source)
    at java.awt.EventQueue.dispatchEvent(Unknown Source)
    at java.awt.EventDispatchThread.pumpOneEventForFilters(Unknown Source)
    at java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source)
    at java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source)
    at java.awt.WaitDispatchSupport$2.run(Unknown Source)
    at java.awt.WaitDispatchSupport$4.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.awt.WaitDispatchSupport.enter(Unknown Source)
    at java.awt.Dialog.show(Unknown Source)
    at com.mathworks.mwswing.MJDialog.show(MJDialog.java:311)
    at java.awt.Component.show(Unknown Source)
    at java.awt.Component.setVisible(Unknown Source)
    at java.awt.Window.setVisible(Unknown Source)
    at java.awt.Dialog.setVisible(Unknown Source)
    at com.mathworks.mwswing.MJOptionPane.showOptionDialog(MJOptionPane.java:539)
    at com.mathworks.mwswing.MJOptionPane.showMessageDialog(MJOptionPane.java:435)
    at com.mathworks.mwswing.MJOptionPane.showMessageDialog(MJOptionPane.java:425)
    at com.mathworks.mlwidgets.array.ArrayDialog$1.run(ArrayDialog.java:49)
    at java.awt.event.InvocationEvent.dispatch(Unknown Source)
    at java.awt.EventQueue.dispatchEventImpl(Unknown Source)
    at java.awt.EventQueue.access$200(Unknown Source)
    at java.awt.EventQueue$3.run(Unknown Source)
    at java.awt.EventQueue$3.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.security.ProtectionDomain$1.doIntersectionPrivilege(Unknown Source)
    at java.awt.EventQueue.dispatchEvent(Unknown Source)
    at java.awt.EventDispatchThread.pumpOneEventForFilters(Unknown Source)
    at java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source)
    at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
    at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
    at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
    at java.awt.EventDispatchThread.run(Unknown Source)

BNT example code problem!!!

From [email protected] on May 09, 2010 09:31:43

What steps will reproduce the problem?
1. I am trying to run the example at http://bnt.googlecode.com/svn/trunk/docs/usage.html but I got some errors. I have attached the file; please run it and tell me if I made any mistake.

  1. I want to marginalize the "S" node with respect to "R".
  2. I am getting the following errors in MATLAB:
    ??? Index exceeds matrix dimensions.

Error in ==> discrete_CPD.convert_to_table at 14
T = CPT(index{:});

Error in ==> discrete_CPD.convert_to_pot at 20
T = convert_to_table(CPD, domain, evidence);

Error in ==> jtree_inf_engine.enter_evidence at 57
pot{n} = convert_to_pot(bnet.CPD{e}, pot_type, fam(:), evidence);

Error in ==> BN_cct1 at 15
[engine, loglik] = enter_evidence(engine, evidence);

What is the expected output? What do you see instead? The expected output should be a marginal probability value.

What version of the product are you using? On what operating system? MATLAB 7.6.0 (R2008a) on Vista Home Premium.
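Since the attached BN_cct1.m is not reproduced here, a minimal working sketch of the marginal being asked for (P(S) with R observed), using the standard sprinkler network and CPTs from the usage page, looks like this:

% Hedged sketch: P(S | R = true) for the sprinkler network from the BNT docs.
N = 4; dag = zeros(N, N);
C = 1; S = 2; R = 3; W = 4;
dag(C, [R S]) = 1; dag(R, W) = 1; dag(S, W) = 1;
bnet = mk_bnet(dag, 2*ones(1, N), 'discrete', 1:N);
bnet.CPD{C} = tabular_CPD(bnet, C, [0.5 0.5]);
bnet.CPD{R} = tabular_CPD(bnet, R, [0.8 0.2 0.2 0.8]);
bnet.CPD{S} = tabular_CPD(bnet, S, [0.5 0.9 0.5 0.1]);
bnet.CPD{W} = tabular_CPD(bnet, W, [1 0.1 0.1 0.01 0 0.9 0.9 0.99]);

engine = jtree_inf_engine(bnet);
evidence = cell(1, N);
evidence{R} = 2;                      % R observed in state 2 (true)
[engine, loglik] = enter_evidence(engine, evidence);
m = marginal_nodes(engine, S);        % P(S | R = true)
disp(m.T)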

Attachment: BN_cct1.m

Original issue: http://code.google.com/p/bnt/issues/detail?id=3

There might be a bug in jtree_inf_engine and var_elim_inf_engine

Hi,

I am not sure whether it is a bug in junction tree inference and variable elimination.

For the following BN model, when computing the marginal of each node given node 1, I tested the junction tree, variable elimination, and pearl_inf_engine engines. The results from junction tree and variable elimination are incorrect, while pearl_inf_engine is correct. I have no idea what happens inside the junction tree and variable elimination engines.

I computed the marginal given node 1 for each other node manually. For node 8 and node 9, only pearl_inf_engine got the right results.

I put the learned bnet model (matlab format) in the attachment.
Model configuration :

  • all nodes are discrete
  • node 1 has 8 states and others are binary

[image]

Results: P(A8 = 2 | A1), P(A9 = 2 | A1)
Result (jtree): [image]
Result (var_elim): [image]
Result (pearl): [image]
Manual calculation: [image]

Code:
jtree_Engine = jtree_inf_engine(bnet);
var_Engine = var_elim_inf_engine(bnet);
pearl_Engine = pearl_inf_engine(bnet);

jtree_conMarg = compMarginal(bnet, jtree_Engine);
var_conMarg_ = compMarginal(bnet, var_Engine);
pearl_conMarg_ = compMarginal(bnet, pearl_Engine);


function [conMarg] = compMarginal(bnet, engine)
%% First node: exp, 8 states; others: AUs, binary
numNodes = size(bnet.dag, 1);
numAUs = numNodes - 1;
numCls = bnet.node_sizes(1);
evidence = cell(1, numNodes);
conMarg = zeros(numCls, numAUs);
for i = 1:numCls
    evidence{1} = i;
    engine = enter_evidence(engine, evidence);
    for j = 1:numAUs
        temNodeInd = j + 1;
        temMarg = marginal_nodes(engine, temNodeInd);
        conMarg(i, j) = temMarg.T(2);
    end
end

bnet_learnedModel.zip

How can i edit add_BNT_to_path.m file (Bayes Net Toolbox)

I want to use the Bayes Net Toolbox, but I have problems integrating BNT into MATLAB. To explain: I download the package and extract it with a program like WinZip, which creates a directory called BNT. The creator of this library says we have to modify the file add_BNT_to_path.m ("Edit the file BNT/add_BNT_to_path.m so it contains the correct pathname. For example, in Windows, I download BNT.zip into C:\kpmurphy\matlab, then comment out the second line (with the % character) and uncomment the third line, which reads BNT_HOME = 'C:\kpmurphy\matlab\BNT'").

My questions are: 1) How can I edit this file? 2) There is a comment in this file, "The syntax was changed between matlab 5 and matlab 6", but I use MATLAB 8.5 (R2015a), so what should I do? I tried to solve this but could not; please help me.
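A minimal sketch of the edit being described, assuming BNT was unzipped to C:\kpmurphy\matlab\BNT (substitute your own folder); the "matlab 5 vs matlab 6" comment can be ignored on R2015a, since addpath/genpath work the same way there. The file can be opened with the MATLAB editor (edit add_BNT_to_path) or any text editor.

% Inside BNT/add_BNT_to_path.m: point BNT_HOME at your install folder.
% BNT_HOME = 'change this to your path';     % placeholder line, left commented out
BNT_HOME = 'C:\kpmurphy\matlab\BNT';         % folder where BNT.zip was extracted
addpath(genpath(BNT_HOME));                  % add BNT and all its subdirectories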

Error in learning Network Structure using learn_struct_pdag_pc() method

From [email protected] on January 12, 2012 17:53:47

What steps will reproduce the problem?
1. I want to learn the PDAG from Alarm network data with 200,000 records using the learn_struct_pdag_pc() method.

2. The call is like this:
[pdag, G] = learn_struct_pdag_pc('cond_indep_fisher_z',n , n-1 ,CovMatrix,nSamples,alpha)

where
[CovMatrix, nSamples, varfields] = CovMat(data_BNT_path,n)

I got this error :
??? Error using ==> erfc
Input must be real and full.

Error in ==> cond_indep_fisher_z>normcdf at 73
p(k) = 0.5 * erfc( - (x(k) - mu(k)) ./ (sigma(k) * sqrt(2)));

I get this error for some other networks as well.
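A hedged diagnostic (not a confirmed fix): erfc complains when the Fisher z statistic comes out complex, which happens when an estimated (partial) correlation falls outside [-1, 1], typically because CovMatrix is not symmetric positive definite. A quick sanity check before calling the PC search:

% Hedged sketch: check the covariance matrix fed to cond_indep_fisher_z.
assert(norm(CovMatrix - CovMatrix', 'fro') < 1e-8, 'CovMatrix is not symmetric');
[R, notPD] = chol(CovMatrix);
if notPD
    % not positive definite; one common (hypothetical) remedy is mild regularization
    CovMatrix = CovMatrix + 1e-6 * eye(size(CovMatrix, 1));
end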

Original issue: http://code.google.com/p/bnt/issues/detail?id=24

Slow inference on chain-like graph with independent parts.

Greetings,

I am just starting out with Bayesian networks and I am trying to model a long chain of variables with causal relationships (these are Gaussian variables representing notes in a musical piece, conditional on notes from the past). The network looks like the following figure. Note that it's not a DBN, as the variables are not always the same; rather, all of the nodes are distinct things.

http://imgur.com/48kGGsw

C is a discrete node with size 2 and the 'i' variables are Gaussian. There are a number of i variables that are not shown in the figure, but it looks like a long chain with C affecting all of the i variables. All of the nodes are observable, but at any point we'll only have observed a part of the i nodes, and I need to calculate a marginal probability distribution for node i given the previous ones. As you can probably see, each node depends only on the C node and its direct predecessor, so as I understand it this makes each i node independent of all previous ones except i-1; if I enter evidence for i-1 and C, this should be enough to calculate i, and i-2, i-3, etc. do not affect the marginalization.

However, it seems that when using the jtree engine to do this, the calculation takes longer the further along the chain the processing goes, which leads me to believe that it's including variables further back in the chain that are not important, making the inference incredibly slow.

Am I missing/misunderstanding something, or is this a bug in the toolbox? One thing I've tried is to represent the graph as individual bnets in an array, each consisting of a node and its Markov blanket (children and direct parents), and to train them all separately. The results seem to be the same, but it runs much faster.

Thanks in advance!
Bogdan

Learning problem with a 6-variable DBN with only one observable variable. What am I doing wrong?

From [email protected] on May 05, 2011 14:45:05

What steps will reproduce the problem? Here is the code:

%% Construct DBN
O = 32; % num observable symbols
ss = 6; % slice size
intra = zeros(ss);

% One observation variable the #1
intra(2,1) = 1;
intra(3,1) = 1;
intra(4,1) = 1;
intra(5,1) = 1;
intra(6,1) = 1;

inter = zeros(ss);
inter(2, [2 3 4 5]) = 1;
inter(3, [2 3 4 6]) = 1;
inter(4, [2 3 4 5 6]) = 1;
inter(5, [2 4 5 6]) = 1;
inter(6, [3 4 5 6]) = 1;

onodes = 1; % observed
dnodes = 1:ss; % discrete
ns = [O 2*ones(1,ss-1)]; % binary nodes except observed node
eclass1 = [1 2 2 2 2 2];
eclass2 = [1 3:7];
eclass = [eclass1 eclass2];

bnet = mk_dbn(intra, inter, ns, 'discrete', dnodes, 'observed', onodes, 'eclass1', eclass1, 'eclass2', eclass2);

%% Set Prior
% Use Uniform Prior :
for e=1:max(eclass)
    bnet.CPD{e} = tabular_CPD(bnet, e, 'CPT', 'unif');
end

%% DATA : sequences of 100 observations
k = 300; % Number of samples
T = 100; % Length of samples
data = randi(32,k,T);

%% Learn DBN
engine = smoother_engine(jtree_2TBN_inf_engine(bnet));

ncases = k;%number of examples
cases = cell(1,ncases);
for i=1:ncases
    % ev = sample_dbn(bnet, T);
    cases{i} = cell(ss,T);
    cases{i}(onodes,:) = num2cell(data(onodes,1:T), 1);
end

[bnetH2, LLtrace] = learn_params_dbn_em(engine, cases, 'max_iter', 4);

What is the expected output? What do you see instead? Here is the error:

??? Error using ==> reshape
To RESHAPE the number of elements must not change.

Error in ==> myreshape at 10
T = reshape(T, sizes(:)');

Error in ==> dpot.dpot at 21
pot.T = myreshape(T, sizes);

Error in ==> discrete_CPD.convert_to_pot at 28
pot = dpot(domain, ns(domain), T);

Error in ==> jtree_2TBN_inf_engine.fwd at 15
CPDpot{n} = convert_to_pot(bnet.CPD{e}, engine.pot_type, fam(:), ev2);

Error in ==> smoother_engine.enter_evidence at 14
[f{t}, ll(t)] = fwd(engine.tbn_engine, f{t-1}, ev(:,t), t);

Error in ==> learn_params_dbn_em>EM_step at 131
[engine, ll] = enter_evidence(engine, evidence);

Error in ==> learn_params_dbn_em at 82
[engine, loglik, logpost] = EM_step(engine, evidence, temperature);

Error in ==> testDBN1 at 63
[bnetH2, LLtrace] = learn_params_dbn_em(engine, cases, 'max_iter', 4);

What version of the product are you using? On what operating system? I'm using MATLAB 2010a on Windows 7 x64.

Additional information: the aim is to classify data that look like the data I generated, so I'd like to learn two DBNs over two labeled sets of data. But I'm not even capable of learning one DBN :P

I have spent two days searching for what I have done wrong... If you can tell me, I would be very pleased.

Thank you.

PS: sorry for the mistakes in English... it's not my mother tongue :)
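A hedged guess at one slip in the data-loading loop above (not a confirmed fix, and it may not remove the reshape error on its own): data is k-by-T with one sequence per row, but the loop indexes data(onodes,1:T), so every case is filled from row 1 rather than row i, and the extra num2cell(..., 1) argument is unnecessary for a single observed node.

% Hedged sketch of the intended per-sequence loop (poster's variable names).
for i = 1:ncases
    cases{i} = cell(ss, T);
    cases{i}(onodes, :) = num2cell(data(i, 1:T));   % row i = sequence i
end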

Attachment: testDBN1.m

Original issue: http://code.google.com/p/bnt/issues/detail?id=10

Cannot install on Mac OS X with BNT version 1.0.7

From [email protected] on December 12, 2013 14:45:23

What steps will reproduce the problem?
1. I download and extract full_bnt.1.0.7.
2. I run addpath(genpath(...)) as in the install manual.
3. I cannot find the test_BNT function, and MATLAB crashes after my addpath() command.

What is the expected output? What do you see instead? MATLAB crashes, and normal functions like ls and pwd no longer work.

What version of the product are you using? On what operating system? A MacBook Air 2013 with Mac OS X.

Attachment: error

Original issue: http://code.google.com/p/bnt/issues/detail?id=34

Assign values to nodes?

From [email protected] on July 11, 2012 14:54:01

I got the idea of using the mk_bnet function to construct a Bayesian network with conditional probabilities, but I cannot figure out how to assign values to the nodes.

The nodes in our problem have both probabilities and values; for example, the rain has a 30% chance of being large, which means 100 mm/day.

And the deterministic nodes, which are calculated from their parent nodes' values, can they be constructed with deterministic_CPD?

I know this may not be an issue for an expert, but please help. Thank you!
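One hedged way to encode part of what is being asked, using nothing beyond tabular CPDs: keep the node's states discrete (e.g. state 2 = "large rain, meaning 100 mm/day") and attach the physical value outside the network, and encode a deterministic child as a CPT whose entries are all 0 or 1. A sketch for a hypothetical binary node D (node 3) that is the AND of two binary parents A (node 1) and B (node 2):

% Hedged sketch: a deterministic node as a degenerate tabular CPD.
% CPT dimensions are ordered (A, B, D); state 1 = false, state 2 = true.
cpt = zeros(2, 2, 2);
cpt(:, :, 1) = [1 1; 1 0];    % P(D = false | A, B)
cpt(:, :, 2) = [0 0; 0 1];    % P(D = true  | A, B): 1 only when A and B are true
bnet.CPD{3} = tabular_CPD(bnet, 3, 'CPT', cpt);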

Original issue: http://code.google.com/p/bnt/issues/detail?id=27

inference error ; help me please

From [email protected] on September 19, 2011 12:55:47

Hi,
Email address: [email protected]

Please help me! I wrote code for inference but it gives the error below:

??? Error using ==> subsindex
Function 'subsindex' is not defined for values of class 'cell'.

Error in ==> discrete_CPD.convert_to_table at 14
T = CPT(index{:});

Error in ==> discrete_CPD.convert_to_pot at 20
T = convert_to_table(CPD, domain, evidence);

Error in ==> jtree_inf_engine.enter_evidence at 57
pot{n} = convert_to_pot(bnet.CPD{e}, pot_type, fam(:), evidence);

Error in ==> Rafe_inference at 116
[engine, loglik] = enter_evidence(engine, evidence);

My code is:

clear all
clc

A = xlsread('E:\DATA MINING\final_cut.xlsx');

N = 6;
dag = zeros(N,N);
AS = 1; AM = 2; CC = 3; SC = 4; VT = 5; DA = 6;
dag(3:6,AS) = 1; dag(2,3) = 1; dag(3,4) = 1;

discrete_nodes = 1:N;
node_sizes = [3 9 9 21 10];
onodes = 2:6;

bnet = mk_bnet(dag, node_sizes, 'observed', onodes);
draw_graph(bnet.dag);

bnet.CPD{AS} = tabular_CPD(bnet,AS);
bnet.CPD{AM} = tabular_CPD(bnet,AM);
bnet.CPD{CC} = tabular_CPD(bnet,CC);
bnet.CPD{SC} = tabular_CPD(bnet,SC);
bnet.CPD{VT} = tabular_CPD(bnet,VT);
bnet.CPD{DA} = tabular_CPD(bnet,DA);

TrainingSamples = cell(N,size(A,1));
for i = 1:size(A,1)
    TrainingSamples(1,i) = {A(i,1)'};
    TrainingSamples(2,i) = {A(i,2)'};
    TrainingSamples(3,i) = {A(i,3)'};
    TrainingSamples(4,i) = {A(i,4)'};
    TrainingSamples(5,i) = {A(i,5)'};
    TrainingSamples(6,i) = {A(i,6)'};
end

bnet = learn_params(bnet,TrainingSamples);

engine = jtree_inf_engine(bnet);

evidence = cell(1,N);
evidence{AM} = {A(i,2)'};
evidence{CC} = {A(i,3)'};
evidence{SC} = {A(i,4)'};
evidence{VT} = {A(i,5)'};
evidence{DA} = {A(i,6)'};

[engine, loglik] = enter_evidence(engine,evidence);

marg = marginal_nodes(engine, AS);
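A hedged guess at the cause of the subsindex error above: enter_evidence expects each observed entry of the evidence cell array to hold the value itself (a scalar state index for a discrete node), but the code wraps every value in an extra cell, e.g. evidence{AM} = {A(i,2)'}; the training samples have the same problem. A sketch of the corrected assignments, assuming each column of A already holds integer states in 1..node_sizes(j):

% Hedged sketch: store scalar state indices directly, not nested cells.
for i = 1:size(A,1)
    for j = 1:N
        TrainingSamples{j,i} = A(i,j);
    end
end
bnet = learn_params(bnet, TrainingSamples);

evidence = cell(1,N);
evidence{AM} = A(i,2);     % scalar, not {A(i,2)'}
evidence{CC} = A(i,3);
evidence{SC} = A(i,4);
evidence{VT} = A(i,5);
evidence{DA} = A(i,6);
[engine, loglik] = enter_evidence(engine, evidence);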

Attachment: Rafe_inference.m Rafe.m final_cut.xlsx

Original issue: http://code.google.com/p/bnt/issues/detail?id=18

Neato text compare broken

From [email protected] on January 05, 2012 16:43:51

What steps will reproduce the problem?
1. Run draw_dot(G).
2. An error is returned.

What is the expected output? What do you see instead? A graph; instead I see an error.

What version of the product are you using? On what operating system? neato - graphviz version 2.28.0 (20110507.0327).

Additional information: it looks like the output from neato has changed, so the text compare on line 95 of dot_to_graph.m no longer works. If I change the line to

[node_pos] = sscanf(line(pos_pos:length(line)), 'pos="%f,%f"')';

(from [node_pos] = sscanf(line(pos_pos:length(line)), ' pos = "%d,%d"')';)

then it works. Also, I did not have the range function in my distribution, which is called on lines 114 and 115 of dot_to_graph.m. I guessed that it is simply max(x)-min(x), and that seems to work.
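The range guess mentioned above, written out as a minimal stand-in for the Statistics Toolbox function of the same name (which is where dot_to_graph.m normally gets it):

function r = range(x)
% Minimal stand-in: spread of the data, as used by dot_to_graph.m.
r = max(x) - min(x);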

Original issue: http://code.google.com/p/bnt/issues/detail?id=22

dynamic bayesian network

From [email protected] on December 24, 2013 02:08:07

Hello!
I'm doing research in natural language processing, and recently I found DBNs as a tool for finding relations between word features. I began studying DBNs from Murphy's 2002 thesis, and then found I have to study more applied probability, which I am currently doing.
I wanted to ask:
If I have fully observed data, how can I find the relations between the variables, and how can I learn the CPTs? Is it done with the EM algorithm?
How much do I need to learn to get all the concepts of Bayes nets?
Thanks
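A hedged pointer for the fully observed case described above: when every node is observed in every training case, EM is not needed; maximum-likelihood counting with learn_params is enough (EM, via learn_params_em, is only required when some values are missing or hidden). A minimal sketch, where bnet and data are placeholder names:

% Hedged sketch: ML parameter learning from fully observed discrete data.
% data is an nnodes-by-ncases matrix of state indices (placeholder name).
cases = num2cell(data);            % one cell per node/case, no empty entries
bnet = learn_params(bnet, cases);  % closed-form counting, no EM required
s = struct(bnet.CPD{1});           % peek inside the CPD object
disp(s.CPT)                        % the learned CPT for node 1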

Original issue: http://code.google.com/p/bnt/issues/detail?id=35

HMM Inference - Urgent; Please Help

From [email protected] on July 11, 2011 19:33:18

What steps will reproduce the problem? 1. Running the attached code reproduces the problem.

What is the expected output? What do you see instead? I am confused by the marginal_nodes function; I am not sure what the nodes and t arguments are supposed to be. I have a simple Gaussian HMM. I learned the parameters, and for each Gaussian observation I want to predict the state (3 hidden states).

So I entered the evidence, and now I want to estimate the marginals, which unfortunately fails with an error in the call to marginal_nodes. Please let me know what values should be given to the arguments nodes and t (I did not understand the documentation's explanation).

What version of the product are you using? On what operating system? Mac OS and the latest version, which I downloaded from the bayesnet website just today.

Additional information: mine is a simple univariate Gaussian output with discrete hidden states (3 states). My problem is with the marginal_nodes function.
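A hedged sketch of the call being asked about, assuming the usual two-nodes-per-slice HMM layout (node 1 = hidden state, node 2 = Gaussian observation) and a DBN smoother/filter engine that already has the evidence entered: nodes is the within-slice node number (or a vector of them) and t is the time slice to query.

% Hedged sketch: query the hidden-state marginal at slice t.
t = 5;                                % which time slice
m = marginal_nodes(engine, 1, t);     % node 1 = hidden state in slice t
disp(m.T)                             % 3x1 vector: P(state | all evidence)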

Attachment: HMM_Continous_Bnet.m

Original issue: http://code.google.com/p/bnt/issues/detail?id=14

Missing Values Crash Learning In learn_params_dbn_em

From [email protected] on July 25, 2012 09:02:19

What steps will reproduce the problem? Running the following code will reproduce the problem:

intra = [0 1; 0 0];
inter = [1 0; 0 0];
num_nodes = 2;
num_states = [2 2];% num of states
dnodes = [1 2]; % indices of discrete nodes
onodes = 2;% indices of observed nodes
eclass1 = [1 2];
eclass2 = [3 2];

N = max([eclass1 eclass2]);
CPT = cell(1,N);

bnet = mk_dbn(intra, inter, num_states, 'discrete', dnodes, ...
'observed', onodes, 'eclass1', eclass1, 'eclass2', eclass2);

bnet.CPD{1} = tabular_CPD(bnet,1);
bnet.CPD{2} = tabular_CPD(bnet,2);
bnet.CPD{3} = tabular_CPD(bnet,3);
engine = smoother_engine(jtree_2TBN_inf_engine(bnet));

ss = 2;%slice size(ss)
ncases = 10;%number of examples
T=10;
max_iter=2;%iterations for EM
cases = cell(1, ncases);
for i=1:ncases
    ev = sample_dbn(bnet, T);
    cases{i} = cell(ss,T);
    cases{i}(onodes,:) = ev(onodes, :);
    cases{i}{2,3} = [];
    cases{i}{2,4} = [];
end

[bnet2, LLTrace] = learn_params_dbn_em(engine, cases);

for i=1:N
    s = struct(bnet2.CPD{i});
    CPT{i} = s.CPT;
end
celldisp(CPT);

What is the expected output? What do you see instead? I expect to see CPTs printed to the screen for the prior, transition, and emission probabilities. Instead, I get the following error:

Error using .*
Matrix dimensions must agree.

Error in mult_by_table (line 7)
bigT(:) = bigT(:) .* Ts(:); % must have bigT(:) on LHS to preserve shape

Error in dpot/multiply_by_pot (line 11)
Tbig.T = mult_by_table(Tbig.T, Tbig.domain, Tbig.sizes, Tsmall.T, Tsmall.domain, Tsmall.sizes);

Error in jtree_inf_engine/init_pot (line 17)
clpot{c} = multiply_by_pot(clpot{c}, pots{i});

Error in jtree_2TBN_inf_engine/fwd (line 36)
[f.clpot, f.seppot] = init_pot(engine.jtree_engine, clqs, pots, engine.pot_type, engine.observed);

Error in smoother_engine/enter_evidence (line 14)
[f{t}, ll(t)] = fwd(engine.tbn_engine, f{t-1}, ev(:,t), t);

Error in learn_params_dbn_em>EM_step (line 131)
[engine, ll] = enter_evidence(engine, evidence);

Error in learn_params_dbn_em (line 82)
[engine, loglik, logpost] = EM_step(engine, evidence, temperature);

Error in mem_no_aux (line 58)
bnet2 = learn_params_dbn_em( engine, cases);

What version of the product are you using? On what operating system? I am using BNT Full Version 1.07 on 64-bit Windows 7.

Additional information: the code provided is one of the sample HMM learning scripts, slightly augmented so that there are missing values in the samples for the observed nodes. Substituting a sample with the missing-value symbol, [], causes element-wise matrix multiplication errors in mult_by_table, preventing the program from continuing.

Original issue: http://code.google.com/p/bnt/issues/detail?id=28

Can we draw the junction tree graph using the Bayes Net Toolbox like in Hugin, and if not, can we still print the cliques?

From [email protected] on January 11, 2011 05:55:50

What steps will reproduce the problem?
1. I am using BNT to find a marginal probability.
2. When I use the junction tree engine, calling the engine variable only shows "1-by-1".

What is the expected output? What do you see instead? 1-by-1.

What version of the product are you using? On what operating system? MATLAB 7.1 and BNT 1.0.7.

Additional information: I want to see the junction tree graph using BNT, as can be drawn in Hugin and other probabilistic software, or at least see the combined nodes in the form of cliques, so I can analyze which nodes in my simulation are giving higher or lower values.

Thank You so much in advance.
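BNT has no built-in junction tree plot like Hugin, but the cliques are stored inside the engine object and can be printed; a hedged sketch (the field names are taken from jtree_inf_engine but should be treated as an assumption):

% Hedged sketch: list the cliques of the junction tree BNT builds.
engine = jtree_inf_engine(bnet);
S = struct(engine);           % expose the fields of the old-style object
celldisp(S.cliques)           % each cell = the node indices of one clique
% S.jtree is the clique-to-clique adjacency matrix, which draw_graph can show:
% draw_graph(S.jtree);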

Original issue: http://code.google.com/p/bnt/issues/detail?id=9

For speech recognition, what should the number of hidden states (Q) be when we use an HMM?

From [email protected] on October 20, 2012 05:12:24

I started working on DBNs recently, after working on HMMs before.

I would like to create a speech recognition system with a DBN using BNT.

I understand that each frame will be modelled by a set of hidden and observed nodes.

The observed node size will be the size of the acoustic frame, but I can't understand how to fix the hidden node size.

I am thinking about using the number of phonemes, but I am not sure.
Can anyone help me? Thanks in advance.

Original issue: http://code.google.com/p/bnt/issues/detail?id=29

learn_params_em() for empty node

Hi,
I'm trying to estimate the parameters of a Bayesian network with the function learn_params_em(). The variables in the network are either partially observed or have no observations at all. I found the brief documentation of this function at the link below:
http://bnt.googlecode.com/svn/trunk/docs/usage.html

I see it says learn_params_em() is for the case "when some values are not observed". My question is: when all values of a variable are never observed, will this function return meaningful results?

Thanks,

Using Dirichlet Priors

From [email protected] on October 25, 2011 09:02:18

Hey there,

I am trying to understand how to use Dirichlet priors on the parameters of a CPT. In the documentation I can see how to define a Dirichlet prior, but it seems to me that it is not possible to specify the Dirichlet pseudo counts separately for each entry.
In my case, I have a prior CPT that represents my prior knowledge, but if I define a Dirichlet prior over my parameters, I can only specify the ESS.
The function bayes_update_params will then recompute the pseudo counts by summing the counts present in the data for a particular parent-child combination with the Dirichlet prior counts, but neglecting the prior CPT.
If I have P(X=x | Pa1=p) = 0.9 and a BDeu Dirichlet prior with ESS = 10, I would expect my Dirichlet prior to already have a pseudo count of 9 for that particular state; instead I get the same number of pseudo counts spread over all the possible states.
The result is that my posterior is relatively flat and driven by the data, despite my prior CPT being sharply peaked around some entries.
What am I doing wrong?
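For reference, a hedged sketch of why the BDeu prior comes out flat: with equivalent sample size ESS, every CPT entry of a node with r states and q parent configurations receives the same pseudo count ESS/(q*r), independent of any prior CPT.

% Hedged illustration of the BDeu pseudo counts spread over a CPT.
ESS = 10;                 % equivalent sample size
r   = 2;                  % number of child states
q   = 2;                  % number of parent configurations (e.g. one binary parent)
alpha = ESS / (q * r)     % the same count for every entry, not 9 for the favoured state

A prior sharply peaked at 0.9 with ESS = 10 would instead correspond to per-entry counts of roughly ESS times the prior probabilities (9 and 1 for a binary state); as far as the documented tabular_CPD options go ('dirichlet_type' of 'unif' or 'BDeu'), such a per-entry table cannot be passed in directly.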

Original issue: http://code.google.com/p/bnt/issues/detail?id=19

Greedy Equivalence Search - discrete value data

Hi,

I wonder if BNT includes code for Greedy Equivalence Search (GES). I need to run the GES algorithm on multiple datasets (~5000 data points) that have discrete values. (One GES algorithm in TETRAD is programmed to handle only continuous data.) If GES is implemented in BNT, can I use it with multiple datasets once I download BNT and understand how to use it?

Regards,
Sanghoon


Linear Gaussian model with low variance

Hi,

I have an issue with linear Gaussian models that have low variance. If the variance of the linear Gaussian is too low, then the means of the variables in the joint distribution tend to zero. I have attached a simple MATLAB script showing this:
means_issue.txt

This may be a problem for my application, since I have these linear Gaussian models in a dynamic Bayesian network whose time-t belief updates after each time step, possibly reducing the variances of multiple nodes.

What is the recommended way to work-around this issue?

DBN Inference: How to compute CPT of nodes in time-slice > 1

Hello Everyone,

I am trying to learn DBNs and solve a few examples.

In this example, random CPTs are assigned to nodes:
http://www.cs.ubc.ca/~murphyk/Bayes/usage_dbn.may22.html#hmm

Code:
bnet = mk_dbn(intra, inter, ns, dnodes);
for i=1:4
bnet.CPD{i} = tabular_CPD(bnet, i);
end

I am trying to model the example on this page as a DBN:
http://bnt.googlecode.com/svn/trunk/docs/usage.html#basics

  1. The initial CPT for time-slice 1 is given in the example.
  2. I am assuming the transition probability of all hidden nodes to be 0.9, i.e. P(Xt | X(t-1)) = 0.9.

Now my question is how to compute the CPT of C2, S2, R2 for time-slice 2?

S2 here will depend on S1 and C2

image

I would really appreciate any kind of help.

Thanks in advance!

AD
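A hedged sketch of how slice-2 CPDs are attached, with several assumptions not in the post: the sprinkler ordering C=1, S=2, R=3, W=4 within a slice (so C2=5, S2=6, ...), mk_dbn called with eclass1 = 1:4 and eclass2 = 5:8 so that every slice-2 node gets its own CPD, and BNT's convention that CPT dimensions follow increasing node number with the child last. For S2, whose parents are S1 (node 2) and C2 (node 5), the table is therefore indexed as (S1, C2, S2); the numeric values below are purely hypothetical, chosen to echo the 0.9 persistence assumption.

% Hedged sketch: CPD for S2 (node ss+S = 6) in slice 2.
ss = 4; S = 2;
cpt_S2 = zeros(2, 2, 2);                 % dims: (S1, C2, S2)
cpt_S2(:, :, 2) = [0.5 0.1; 0.9 0.9];    % hypothetical P(S2 = on | S1, C2)
cpt_S2(:, :, 1) = 1 - cpt_S2(:, :, 2);   % P(S2 = off | S1, C2)
bnet.CPD{ss + S} = tabular_CPD(bnet, ss + S, 'CPT', cpt_S2);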

installation fails on Mac MATLAB 2010b?

From [email protected] on July 14, 2011 16:11:36

What steps will reproduce the problem? 1. Just copied bnt into my MATLAB folder (in my path).
2. Ran test_bnt
3. Assertion trips.

What is the expected output? What do you see instead? The test_bnt script executing successfully. What I see is:
??? Error using ==> assert at 9
assertion violated:

Error in ==> mk_rooted_tree at 12
assert(isequal(post, post2));

Error in ==> jtree_inf_engine.jtree_inf_engine at 108
[engine.jtree, engine.preorder, engine.postorder] = mk_rooted_tree(engine.jtree, engine.root_clq);

Error in ==> cg1 at 17
engines{end+1} = jtree_inf_engine(bnet);

Error in ==> test_BNT at 5
cg1

What version of the product are you using? On what operating system? I am using BNT 1.0.7 on Mac OS X 10.6 and MATLAB 2010b.

Original issue: http://code.google.com/p/bnt/issues/detail?id=15

Install issue - Downloading problem

From [email protected] on May 13, 2011 18:10:55

Hey, I got the following message. I have already changed the folder's permissions. I'm running Windows 7 and MATLAB R2008b.

initializing pmtk3
downloading 39 packages to pmtk3/pmtksupportCopy from pmtksupport.googlecode.com - this may take a few minutes
downloading GGM-GWishart.............done
downloading GPstuff-2.0..............done
downloading SPAMS-1.02...............done
downloading boostingDemo.............done
downloading bpca.....................done
downloading dpMixWood................done
downloading dpmixturesTeh07..........done
downloading ekfukf1.2................done
downloading export_fig...............done
downloading exportfig................done
downloading fastICA-2.5..............done
downloading fastfit..................done
downloading gaimc1.0-graphAlgo.......done
downloading glmnet-matlab............Warning: Permission denied to create file "C:\Program
Files\MATLAB\ R2008b \toolbox\probabilistic
toolkit\pmtksupportCopy\glmnet-matlab\glmnetMex.mexw64".

In iofun\private\extractArchive>extractArchiveEntry at 108
In iofun\private\extractArchive at 52
In unzip at 92
In downloadAllSupport at 22
In initPmtk3 at 49
done
downloading gpml-matlab..............done
downloading graphViz4Matlab..........done
downloading l1ls.....................done
downloading lars.....................done
downloading libdai-0.2.6.............done
downloading liblinear-1.51...........done
downloading libsvm-mat-2.9.1.........Warning: Permission denied to create file "C:\Program
Files\MATLAB\ R2008b \toolbox\probabilistic
toolkit\pmtksupportCopy\libsvm-mat-2.9.1\libsvmTrain.mexw64".
In iofun\private\extractArchive>extractArchiveEntry at 108
In iofun\private\extractArchive at 52
In unzip at 92
In downloadAllSupport at 22
In initPmtk3 at 49
done
downloading lightspeed2.3............done
downloading markSchmidt-9march2011...done
downloading matbugs..................done
downloading matlabRlink..............done
downloading maxBranching.............done
downloading mcmcdiag.................done
downloading mplp-1.0.................done
downloading netlab3.3................done
downloading onlineEM.................done
downloading pfColorTracker...........done
downloading pmtkSupportRoot.m........failed to download
downloading randraw..................done
downloading rbpfSlam.................done
downloading readme.txt...............failed to download
downloading rjmcmcRbf................done
downloading sparseBayes2.0...........done
downloading svmLightWindows..........done
downloading vblinlogreg..............done
??? Undefined function or variable 'pmtkSupportRoot'.

Error in ==> installLightspeedPMTK at 19
directory = fullfile(pmtkSupportRoot, getConfigValue('PMTKlightSpeedDir'));

Error in ==> initPmtk3 at 178
installLightspeedPMTK();

Regards

Original issue: http://code.google.com/p/bnt/issues/detail?id=11

Suspected bug wrt node sizes in potentials

From [email protected] on March 21, 2012 15:59:37

First, let me thank you for providing and taking care of this great toolkit! I've been using it for a while and it has opened lots of new possibilities for my research!

What steps will reproduce the problem? I have attached an .m file which reproduces the error. Note that the file is based on the Mixture-of-Experts example provided on this website. I changed the network to have dimension 2 in the node denoted by X. The error occurs in enter_evidence when only X is observed as evidence. If any other node is observed, the error does not occur. Also, the model can be learned without error. I added the comment "fails" to the respective line in the script.

What is the expected output? What do you see instead?

Error using +
Matrix dimensions must agree.

Error in cpot/multiply_by_pot (line 11)
bigpot.h(u) = bigpot.h(u) + smallpot.h;

Error in cgpot/multiply_by_pot (line 18)
bigpot.can{i} = multiply_by_pot(bigpot.can{i}, smallpot.can{src});

Error in jtree_inf_engine/init_pot (line 17)
clpot{c} = multiply_by_pot(clpot{c}, pots{i});

Error in jtree_inf_engine/enter_evidence (line 77)
[clpot, seppot] = init_pot(engine, clqs, pot, pot_type, onodes);

Error in TEST (line 36)
engine = enter_evidence(engine, {[-0.31; 0.1]; []; []}); % fails

What version of the product are you using? On what operating system? The latest comment in the changelog is "7 May 2010 wsun", so I think it's the latest version of BNT; MATLAB 2011b, Windows 7.

Additional information: I am not very educated w.r.t. Bayesian networks, but I still have a suspicion about what the "bug" could be. In multiply_by_pot, smallpot and bigpot are combined, where in bigpot the observed continuous node's size has been set to 0 (see also the BNT help on this page), while smallpot (for the same observed continuous node) has size 2.

Attachment: TEST.m

Original issue: http://code.google.com/p/bnt/issues/detail?id=25

general BP inference engine (belprop_inf_engine) not working

From [email protected] on July 31, 2010 18:15:55

The general BP inference engine (belprop_inf_engine) for Bayesian networks does not compute marginals properly on even the simplest networks. The Pearl BP inference engine works flawlessly, but the general BP engine for Bayesian networks does not. The general BP inference engine for factor graphs also works properly.

Thanks,

Jason

Original issue: http://code.google.com/p/bnt/issues/detail?id=6

DBN - Inference error on the example HMM problem

From [email protected] on June 27, 2011 01:04:29

Hi,

I have recently started learning the BayesNet toolbox and I faced some problems with running test experiments. I hope someone could help me.

What steps will reproduce the problem?

I run the below script to test the inference algorithm on DBNs using the sample codes from the online tutorial for a simple HMM:


clc
clear

intra = zeros(2);
intra(1,2) = 1; % node 1 in slice t connects to node 2 in slice t

inter = zeros(2);
inter(1,1) = 1; % node 1 in slice t-1 connects to node 1 in slice t

Q = 2; % num hidden states
O = 2; % num observable symbols

ns = [Q O];
dnodes = 1:2;
onodes = [2];

bnet = mk_dbn(intra, inter, ns, 'discrete', dnodes);

% Create CPDs
for i=1:3
bnet.CPD{i} = tabular_CPD(bnet, i);
end

% Create the engine
engine = smoother_engine(jtree_2TBN_inf_engine(bnet));

% Add evidence
T = 10;
ss = 2;
ev = sample_dbn(bnet, T);
evidence = cell(ss,T);
evidence(onodes,:) = ev(onodes, :); % all cells besides onodes are empty
engine = enter_evidence(engine, evidence);

% Compute the marginal
t = 5;
nodes = [1 2];

m = marginal_nodes(engine, nodes, t);

What is the expected output? What do you see instead?

I get the following error when I run the script:


??? Error using ==> times
Matrix dimensions must agree.

Error in ==> mult_by_table at 7
bigT(:) = bigT(:) .* Ts(:); % must have bigT(:) on LHS to preserve shape

Error in ==> dpot.multiply_by_pot at 11
Tbig.T = mult_by_table(Tbig.T, Tbig.domain, Tbig.sizes, Tsmall.T, Tsmall.domain, Tsmall.sizes);

Error in ==> jtree_inf_engine.init_pot at 17
clpot{c} = multiply_by_pot(clpot{c}, pots{i});

Error in ==> jtree_2TBN_inf_engine.fwd1 at 20
[f.clpot, f.seppot] = init_pot(engine.jtree_engine1, CPDclqs, CPDpot, engine.pot_type, engine.observed1);

Error in ==> smoother_engine.enter_evidence at 12
[f{1}, ll(1)] = fwd1(engine.tbn_engine, ev(:,1), 1);

What version of the product are you using? On what operating system?

I have installed the FullBNT-1.0.7 version.

Please provide any additional information below.

I am using Matlab 7.7.0

Original issue: http://code.google.com/p/bnt/issues/detail?id=13

example code on DBN

From [email protected] on November 14, 2011 18:53:50

I am trying to do inference on a HMM model (coupled) that has loops, and I want to use the loopy belief propagation method.

I noticed that there are two functions:
pearl_unrolled_dbn_inf_engine
pearl_dbn_inf_engine

It seems that the latter one does not work yet. Is there a difference between the two? Where can I find more information?

Your response is greatly appreciated.

I am using the FullBNT 1.0.7 version

Original issue: http://code.google.com/p/bnt/issues/detail?id=20

Parallel Computing Toolbox crashes

From [email protected] on October 22, 2012 14:13:05

What steps will reproduce the problem?
1. Install BNT on MATLAB 2012a with the Parallel Computing Toolbox.
2. In MATLAB, go to the Parallel menu, "Select Cluster Profile".

What is the expected output? What do you see instead? Normally you will see the profile "local"; instead, you will get a Java exception.

What version of the product are you using? On what operating system? MATLAB 2012a on Windows 7 Enterprise 64-bit.

Additional information: Workaround: after installing BNT, move the BNT entries in the MATLAB path to the end of the path listing. test_BNT should still work.

Original issue: http://code.google.com/p/bnt/issues/detail?id=30

missing variable in fprintf in log_lik_complete.m

From [email protected] on March 30, 2010 12:48:47

What steps will reproduce the problem? 1. Running log_lik_complete on a bnet with very low likelihoods.

What is the expected output? What do you see instead? Expected:
...
node 4 has very low likelihood
node 5 has very low likelihood
node 8 has very low likelihood
...
I see:
node node node node...

What version of the product are you using? On what operating system? FullBNT-1.0.4 on Ubuntu Linux.

Additional information: to fix this, change line 27 in log_lik_complete.m from
if approxeq(exp(ll), 0), fprintf('node %d has very low likelihood\n'); end

to

if approxeq(exp(ll), 0), fprintf('node %d has very low likelihood\n', i); end

Original issue: http://code.google.com/p/bnt/issues/detail?id=2

Wrong assert in HMM/fwdback.m

Hi,
the function fwdback in HMM/fwdback.m has an option "scaled" to scale the alpha and beta probabilities to avoid numerical problems with long sequences. If this option is set to false, alpha(h,t) is the probability of observing the sequence up to time t and being in state h at time t. The sum of those probabilities over the states (sum(alpha(:,t))) is not supposed to be 1, but rather the probability (or likelihood) of the observation sequence up to time t. Consequently, the assert statements at lines 105 and 125 always fail when running with scaled=false.

The assert statements should either be removed, or moved into the "if scaled" statements (however, they are not necessary there, given the call to normalise).

Best
Giampiero

creating replicating binary tree

From [email protected] on December 09, 2011 09:12:17

I'm curious whether it is possible to create a replicating binary tree using BNT.

To be exact: V(n+1) = 2*V(n) or V(n+1) = 0.5*V(n), where the probability of each event is (say) 50%, with an arbitrary number of steps n (so the complete state space is unknown).

What version of the product are you using? On what operating system? The most recent version, on MATLAB 2011b, Windows 7.

Thanks in advance.

Original issue: http://code.google.com/p/bnt/issues/detail?id=21

enter_evidence for decision networks (LIMIDs)?

I tried to use the function enter_evidence to enter findings into my net and to recalculate the expected utility, but I couldn't do so.
So my question is: is it possible to use this function (enter_evidence) to enter findings in LIMIDs?
Thank you!!

Patch for 'jtree_2TBN_inf_engine' to support partial observability

From [email protected] on December 09, 2010 17:17:01

jtree_2TBN_inf_engine can't handle observed values for a node that you said was hidden. This is a problem when training the DBN, as often you /do/ know the hidden node value and you'd like the algorithm to account for this.

In message #1906 on the mailing list, "Bob" submitted a patch. Please review it and apply it to the codebase. Patches attached.

See: http://tech.dir.groups.yahoo.com/group/BayesNetToolbox/message/1906

Attachment: fwd.patch fwd1.patch

Original issue: http://code.google.com/p/bnt/issues/detail?id=8

inference error

From [email protected] on September 19, 2011 12:54:47

Hi,
Email address: [email protected]

Please help me! I wrote code for inference but it gives the error below:

??? Error using ==> subsindex
Function 'subsindex' is not defined for values of class 'cell'.

Error in ==> discrete_CPD.convert_to_table at 14
T = CPT(index{:});

Error in ==> discrete_CPD.convert_to_pot at 20
T = convert_to_table(CPD, domain, evidence);

Error in ==> jtree_inf_engine.enter_evidence at 57
pot{n} = convert_to_pot(bnet.CPD{e}, pot_type, fam(:), evidence);

Error in ==> Rafe_inference at 116
[engine, loglik] = enter_evidence(engine, evidence);

My code is:

clear all
clc

A = xlsread('E:\DATA MINING\final_cut.xlsx');

N = 6;
dag = zeros(N,N);
AS = 1; AM = 2; CC = 3; SC = 4; VT = 5; DA = 6;
dag(3:6,AS) = 1; dag(2,3) = 1; dag(3,4) = 1;

discrete_nodes = 1:N;
node_sizes = [3 9 9 21 10];
onodes = 2:6;

bnet = mk_bnet(dag, node_sizes, 'observed', onodes);
draw_graph(bnet.dag);

bnet.CPD{AS} = tabular_CPD(bnet,AS);
bnet.CPD{AM} = tabular_CPD(bnet,AM);
bnet.CPD{CC} = tabular_CPD(bnet,CC);
bnet.CPD{SC} = tabular_CPD(bnet,SC);
bnet.CPD{VT} = tabular_CPD(bnet,VT);
bnet.CPD{DA} = tabular_CPD(bnet,DA);

TrainingSamples = cell(N,size(A,1));
for i = 1:size(A,1)
    TrainingSamples(1,i) = {A(i,1)'};
    TrainingSamples(2,i) = {A(i,2)'};
    TrainingSamples(3,i) = {A(i,3)'};
    TrainingSamples(4,i) = {A(i,4)'};
    TrainingSamples(5,i) = {A(i,5)'};
    TrainingSamples(6,i) = {A(i,6)'};
end

bnet = learn_params(bnet,TrainingSamples);

engine = jtree_inf_engine(bnet);

evidence = cell(1,N);
evidence{AM} = {A(i,2)'};
evidence{CC} = {A(i,3)'};
evidence{SC} = {A(i,4)'};
evidence{VT} = {A(i,5)'};
evidence{DA} = {A(i,6)'};

[engine, loglik] = enter_evidence(engine,evidence);

marg = marginal_nodes(engine, AS);

Attachment: Rafe_inference.m final_cut.xlsx

Original issue: http://code.google.com/p/bnt/issues/detail?id=17

mk_bnet is not working....please fix it.

From [email protected] on March 12, 2013 08:20:05

What steps will reproduce the problem? 1. Run the basic Bayes net examples from http://bnt.googlecode.com/svn/trunk/docs/usage.html

What is the expected output? What do you see instead? mk_bnet should work without error.

What version of the product are you using? On what operating system? 1.0.7.

Additional information:
bnet = mk_bnet(dag, node_sizes, 'discrete', [2 2 2 2]);
Undefined function 'mysetdiff' for input arguments of type 'double'.

Then I changed mysetdiff to the built-in setdiff():

bnet = mk_bnet(dag, node_sizes, 'discrete', discrete_nodes);
Undefined function 'parents' for input arguments of type 'double'

Original issue: http://code.google.com/p/bnt/issues/detail?id=31
