lmhtools's Issues
dictionary does not find "Algorithmus"
notes.tex eventually includes /Users/kohlhase/localmh/MathHub/MiKoMH/GenICT/source/programming/en/program-execution.tex, which has
\begin{definition}[display=flow,id=algorithm.def]
\Defi{algorithm}: informal description of what to do (good enough for
humans)
\end{definition}
and indeed, there is an English entry for algorithm, but none for Algorithmus, even though smglom/IWGS/source/algorithm.de.tex exists and has
Ein \defi[name=algorithm]{Algorithmus} ist eine formale oder informelle Spezifikation
glossary generation and nested modules
in MiKoMH/GenCS/source/adt/en/adt-def.tex we have the nested module
\item
\begin{module}[id=adt-Bool]
\symdef{BoolSort}{{{\mathbb{B}}}}
\symdef{BoolTrueConst}{T}
\symdef{BoolFalseConst}{F}
\symdef{BoolADT}{\cB}
\begin{example}[id=bool-adt,for=abstract-data-type]
\inlinedef[for={Boolsort,BoolTrueConst,BoolFalseConst}]
{$\defeq\BoolADT{\adt{\set{\BoolSort}}{\set{\consdecl\BoolTrueConst\BoolSort,
\consdecl\BoolFalseConst\BoolSort}}}$} is an abstract data for
\defiis{truth}{value}.
\end{example}
\end{module}
which generates
\begin{smentry}{\hypertarget{x43d933a4aadfaf62}{truth value}}{MiKoMH/GenCS}
\usemhmodule[mhrepos=MiKoMH/GenCS,path=adt/en/adt-def]{adt-def}
\inlinedef[for={Boolsort,BoolTrueConst,BoolFalseConst}]
{$\defeq\BoolADT{\adt{\set{\BoolSort}}{\set{\consdecl\BoolTrueConst\BoolSort,
\consdecl\BoolFalseConst\BoolSort}}}$} is an abstract data for
\defiis{truth}{value}.
\end{smentry}
Note that the \usemhmodule points to adt-def as the used module, but it should be
\usemhmodule[mhrepos=MiKoMH/GenCS,path=adt/en/adt-def]{adt-Bool}
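A hedged sketch of how the generator could track the innermost open module with a stack while scanning, so glossary entries point at the module that actually contains the definition (adt-Bool, not adt-def). All names here are illustrative, not the actual lmhtools code:

```python
import re

# Track the innermost enclosing module while scanning an sTeX file.
BEGIN_RE = re.compile(r"\\begin\{module\}\[id=([^\]]+)\]")
END_RE = re.compile(r"\\end\{module\}")

def innermost_module_at_defs(tex: str):
    """Yield (module_id, line_no) for each \\inlinedef, using a stack
    of open modules so nested modules shadow their parents."""
    stack, out = [], []
    for no, line in enumerate(tex.splitlines(), 1):
        m = BEGIN_RE.search(line)
        if m:
            stack.append(m.group(1))
        if "\\inlinedef" in line and stack:
            out.append((stack[-1], no))
        if END_RE.search(line) and stack:
            stack.pop()
    return out
```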
more elegant table
I think instead of the current
\begin{longtable}{p{0.495\textwidth}p{0.495\textwidth}}
\textbf{German}&\textbf{English}\\
\hline
\selectlanguage{german}Ausgabesystem & \selectlanguage{english}output subsystem\\
We could use the following (>{...} inserts a macro at the start of each cell in that column):
\begin{longtable}{>{\selectlanguage{german}}p{0.495\textwidth}>{\selectlanguage{english}}p{0.495\textwidth}}
\textbf{Deutsch}&\textbf{English}\\\hline
Ausgabesystem & output subsystem\\
Not essential, but more elegant.
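For reference, a minimal sketch: the >{...} column-preamble syntax comes from the array package, which must be loaded alongside longtable, and babel language names are lowercase (this assumes babel is loaded with german and english):

```latex
\usepackage{array}      % provides the >{...} column specifiers
\usepackage{longtable}
% ... later, in the document body:
\begin{longtable}{>{\selectlanguage{german}}p{0.495\textwidth}%
                  >{\selectlanguage{english}}p{0.495\textwidth}}
  \textbf{Deutsch}&\textbf{English}\\\hline
  Ausgabesystem & output subsystem\\
\end{longtable}
```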
new glossary no longer compiles
because \lstinputmhlisting no longer works, and I bet \mhgraphics does not either.
The problem is that the new macros no longer set the current module. I remember that this worked at some time, but was discarded. We probably need another argument to the glossary entry environment.
duplicates in glossary generation.
I get
\smjointdefref{plan}{x4fa0d953ba1a38a0}{state space}
\smjointdefref{plan}{x4fa0d953ba1a38a0}{state space}
in the glossary for AI. They even have the same hyperref.
lemmata with formulae need \usemhmodule (sometimes)
I am starting to generate a glossary for AI. And I get
\begin{smentry}{\hypertarget{x53168c7dca632656}{\guse[MiKoMH/AI]{a-star}$\protect\astarSearch$ search}}{MiKoMH/AI}
\usemhmodule[repos=MiKoMH/AI,path=search/en/a-star]{a-star}
in glossary.en.tex. This has an error: in line 1, \guse[MiKoMH/AI]{a-star} should be
\usemhmodule[repos=MiKoMH/AI,path=search/en/a-star]{a-star}
i.e. just the same as line 3. That should always work.
hyperreferences between dictionary and glossary
It would be good if we could include \item[\hypertarget{#1}{#1}] into the glossary entry. Then we could generate
\hyperref{AI}{dictionary.pdf#AI}
to have a cross-reference.
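A minimal hedged sketch with standard hyperref macros (whether a PDF viewer actually follows the #AI fragment into another PDF depends on its named-destination support):

```latex
% In the generated glossary entry (the target):
\item[\hypertarget{AI}{AI}] Artificial Intelligence is ...
% In the dictionary (the link); viewer support for jumping to
% a destination inside another PDF varies:
\href{dictionary.pdf#AI}{AI}
```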
glossary drill cards
With the glossary, we can now make drill cards (essentially glossary entries where the lemma is on one side of a card and the definition on the other). These could be printed for drilling.
generate a LaTeX Glossary and Dictionary
@jukkakohonen suggested the following (my paraphrase):
The lmhtools
can already parse a lot of information out of smglom. In particular,
- glossary information, i.e. which words are defined in which definitions in a given language.
- dictionary information, i.e. which words are translations of each other (cross-language) and which words (same-language) are synonyms of each other.
We should use this to generate (as a big sTeX/LaTeX file each)
- for each language a math glossary, i.e. an alphabetic list of lemma/definition pairs, e.g. as in https://en.wikipedia.org/wiki/Glossary_of_computer_science
- for each set of languages a math dictionary, i.e. an alphabetic table of translations (synset:l_1 & synset:l_2 & ...), where synset:l_i is the set of synonyms in a given language
These should only be a Python script away, using the tools we already have. And having these would be very motivating for outsiders to contribute.
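A hypothetical sketch of the glossary half of this: given already-parsed glossary data (lemma to definition text, per language), emit an alphabetic LaTeX list. All names are illustrative assumptions, not existing lmhtools API:

```python
# Emit an alphabetic glossary (description list) from parsed data.
def make_glossary(entries: dict[str, str]) -> str:
    lines = ["\\begin{description}"]
    # case-insensitive alphabetic order, as in a printed glossary
    for lemma in sorted(entries, key=str.casefold):
        lines.append(f"\\item[{lemma}] {entries[lemma]}")
    lines.append("\\end{description}")
    return "\n".join(lines)
```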
the AI graph only has "uses" edges at the moment.
I just generated it, and the imports are missing.
glossary generation over-generates
I generate the AI glossary MathHub/MiKoMH/AI/source/course/notes/en.glossary.tex with make -B gloss and I find the following spurious entry in it:
\begin{smentry}{\hypertarget{x1d02e3243c79d38c}{list constructor}}{MiKoMH/GenICT}
\usemhmodule[mhrepos=MiKoMH/GenICT,path=python/en/lists]{python-lists}
We call \lstinline[mathescape]|[$\pmetavar{seq}$]| the \defii{list}{constructor}.
\end{smentry}
I am not sure how this can come in, especially since there is no trace of loading MiKoMH/GenICT/source/python/en/lists.tex in notes.log.
standalone glossary
The glossary should have a standalone prefix and postfix, i.e. it should be of the form
\documentclass[mh,notes]{mikoslides}
\libinput{preamble}
\begin{document}
and then the regular stuff and then
\end{document}
%%% Local Variables:
%%% mode: latex
%%% TeX-master: t
%%% End:
That would make it much simpler to debug.
There are spurious blue nodes in the IWGS graph
The IWGS graph contains the node "simple-type.tex", which is never shown in the course (but of course included). simple-types.tex should be green.
german glossary for IWGS
please.
error message when MATHHUB variable is not set.
It seems that this is not reliably done.
import-graph output to graphviz
I would like to have a way of outputting the course graph in graphviz form, best with the colors intact. I.e. the output format (.dot for graphviz and .json for TGView3D) should be selectable by an option or so.
I would also like to have a variant of the graph for a chapter in a course, i.e. where only the contribution of that chapter is highlighted. Having that in PDF (i.e. via Graphviz) would allow me to put this into the beginning of each chapter.
We probably have to discuss what I really want here.
Oh, and we need #23 fixed before this makes any sense.
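A hedged sketch of what the selectable output could look like: one graph, two serializers, picked by an option. Function names, option values, and the JSON shape are assumptions, not the existing tool:

```python
import json

# Serialize a colored course graph either as Graphviz DOT or as JSON.
def emit_graph(nodes, edges, fmt="dot"):
    """nodes: {name: color}; edges: [(src, dst)]."""
    if fmt == "json":
        return json.dumps({"nodes": [{"id": n, "color": c}
                                     for n, c in nodes.items()],
                           "edges": [{"from": s, "to": t}
                                     for s, t in edges]})
    lines = ["digraph course {"]
    for n, c in nodes.items():
        lines.append(f'  "{n}" [color="{c}"];')
    for s, t in edges:
        lines.append(f'  "{s}" -> "{t}";')
    lines.append("}")
    return "\n".join(lines)
```

The DOT output can then be rendered to PDF with Graphviz for the per-chapter overview figures.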
give lecture-glossary.py an option for "indexing definitions that are already in the document"
I just discovered that the --big option is only half of what I want: making documents referentially complete by adding the glossary in an appendix. In particular, I do not want the definitions in the document to be duplicated.
So I need an option that treats definienda from inside the document in an index-like fashion: if we have a \defi{foo} in a module bar, then I want to have generated an index-like entry:
\smindex{foo}{bar}
which will expand to something like
\item[foo] see the \href{definition}{foo@bar} on page ??
That will give all the \trefi in the document a unique place to point to.
references from dictionary and glossary into notes.pdf
It would be nice to have references from the glossary and the dictionary back to notes.pdf.
I initially thought that we could use the entries for all symbols foo?bar in notes.idx, e.g.
\indexentry{semantics}{33}
\indexentry{composition!principle}{33}
but this does not work, since we do not have the module names.
I guess I have to generate another file where I have something like
\symbolindex{foo}{bar}{33}
probably in a separate file notes.sidx. Once we have that, it should be relatively easy to add this information to the glossary and the dictionary.
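A hedged sketch of reading the proposed notes.sidx format back in (the file name and the \symbolindex macro are this issue's proposal, not an existing format):

```python
import re

# Parse proposed entries of the form \symbolindex{foo}{bar}{33}
# into (name, module, page) triples.
SIDX_RE = re.compile(r"\\symbolindex\{([^}]*)\}\{([^}]*)\}\{(\d+)\}")

def read_sidx(text: str):
    return [(m.group(1), m.group(2), int(m.group(3)))
            for m in SIDX_RE.finditer(text)]
```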
\usemhmodule and \guse should generate type "uses"
There is currently an "include cycle" in the IWGS graph between editor.tex and wordprocessors.tex, but the edge between wordprocessors.tex and editor.tex should be a uses link. I suspect this is a general problem. I hope that fixing this will reduce cycles and give a much better layout.
dictionary and glossary have custom driver files
I would like to restructure the dictionary and glossary generation.
Instead of generating a full LaTeX file glossary.tex, we should only generate lists glossary-en.tex to be included in a "glossary-driver file". Then I can write glossary driver files of the form
\documentclass{article}
\title...
\begin{document}
\maketitle
\begin{abstract}....\end{abstract}
\section{Introduction}
blabla
\section{The Glossary in English}
\input{glossary-en}
\end{document}
And the generated file does not have to commit to the macros it currently includes. These environments can just be adapted in the driver file.
new form of import/usemhmodule
I have introduced \importmhmodule[dir=foo]{bar} as an abbreviation for \importmhmodule[path=foo/bar]{bar}, and the same for \usemhmodule.
And generally, repos= is now mhrepos=.
I have changed the latter in make_glossary.py, but I suspect I have not found everything.
NOTE: the old form is also still valid.
regression: glossary creation broken for IWGS (and AI)
can be verified by a simple make gloss in .../MathHub/MiKoMH/IWGS/source/course/notes (after uncommenting the #en in the Makefile there).
IWGS glossary and dictionary.
I would like to have a glossary and dictionary like in #6 for the IWGS course, which I am currently multilingualizing.
Resources:
- IWGS sources are in https://gl.mathhub.info/MiKoMH/IWGS/ and neighboring archives.
- An IWGS glossary has been started in https://gl.mathhub.info/smglom/IWGS/
There are two ways glossary information enters the course.
- Once directly via the IWGS glossary; see https://gl.mathhub.info/MiKoMH/GenICT/blob/master/source/programming/en/hardware-software-programming.tex for an example.
- The other is by multilingual modules: see https://gl.mathhub.info/MiKoMH/IWGS/blob/master/source/progintro/en/proglang-synsem.en.tex and https://gl.mathhub.info/MiKoMH/IWGS/blob/master/source/progintro/en/proglang-synsem.de.tex (I am not quite sure whether I want to put the latter into /de/ in the future, but let's just stick with this for now; it works).
lecture-glossary.py has regressed
it seems that it has not followed the recent change to \trefi[kb-query?fail]{failure} (e.g. \mtrefi does not exist any more).
It would be great to have this back.
{example} for \inlinedef as well.
One more possible environment.
tgview shows uses links as undirected
I wonder why.
expected mtrefi should be amended for drefi
I am getting the error message
/Users/kohlhase/localmh/MathHub/MiKoMH/IWGS/source/digdocs/en/trees-cs.tex:11:27: Expected mtrefi for '\drefi[branch?branch]{branches}'
but the \drefi[tree]{root}, \drefi[branch?branch]{branches}, \drefi[tree?leaf]{leaves},
is completely legitimate.
The error message should not be generated in this case.
unbalanced content generated by lecture-glossary.py
In MathHub/MiKoMH/LBS/source/course/notes/Makefile I generate a file en.glossary.tex using lecture-glossary.py (make gloss). But this generates non-well-bracketed content (just run it), which breaks automated make. This is very annoying and delays LBS course notes generation. It would be great if you could fix it.
inlinedef makes unbalanced LaTeX.
glossary generation and drefi (broken and underspecified)
the glossary generation tools do not work as (I) expected. And I think that the behavior is still not sufficiently specified, so I will try to specify it here.
Generally in a situation like
\begin{module}[id=FOO]
\importmodule{foo}
\begin{definition}[id=def.FOOBAR]
A \drefi[foo?bar]{BAR} is a SDFSDFSDSDFSF...
\end{definition}
...
\end{module}
the \drefi[foo?bar]{BAR} references a symbol with name foo?bar (i.e. the symbol bar from the imported module foo) and creates a new symbol FOO?BAR, which is a "re-definition" or "recap" of foo?bar (whatever that really means). In particular, both of them are visible whenever module FOO is imported. For the glossary tools this means that both foo?bar and FOO?BAR should be found if the course contains the snippet above.
For the following I will assume that theory foo is from the SMGloM and is of the form
\begin{module}[id=foo]
\begin{definition}[id=def.foobar]
A \defi{bar} is a sdfsdfsdfsdfsdfsdfsdf...
\end{definition}
...
\end{module}
BUT it is not clear what should be in the generated glossary. There are a couple of possibilities:
- local: only
  - BAR: A BAR is a SDFSDFSDSDFSF...
  i.e. the lemma BAR and the content of the definition def.FOOBAR; after all, that is all that is really in the course.
- source: only
  - BAR: a bar is a sdfsdfsfsdfsdfsdfsdf...
  i.e. the lemma BAR and the content of the definition def.foobar; after all, def.FOOBAR is just an (often abbreviated in courses) recap of def.foobar (and bar and BAR are usually identical).
- both:
  - BAR (see "bar"): A BAR is a SDFSDFSDSDFSF... and
  - BAR: a bar is a sdfsdfsfsdfsdfsdfsdf...
  but that may be very repetitive.
And then there is the question where we want to specify the behavior:
- in the glossary generation program, e.g. by a switch --drefi=local/source/both for the three possibilities above.
- in the \drefi or the enclosing definition; after all, there are multiple paradigmatic uses of \drefi:
  1. the local definition is a literal copy of the source definition; then source and local are identical and both makes no sense. Here sTeX should probably have some way of inputting the respective definition from SMGloM.
  2. the local definition is an abbreviation or reformulation of the source definition (but the lemmata bar and BAR are identical), e.g. the local definition is in the form of an \inlinedef; then probably the source behavior above is called for.
  3. the local definition establishes a synonym BAR for bar; then the both behavior is appropriate.
It seems the latter gives more control: we could use a variant \drefi* for 1 and 2 (their effect on the glossary is the same) and \drefi for 3. The program option --drefi=? might then set a useful default behavior if no behavior is specified on the local drefis.
make lecture-glossary.py fully recursive
It would be good if this had a mode where the glossary is downward closed, i.e. all \trefi references are in the document.
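The downward closure itself is a standard transitive-closure walk; a hedged sketch, assuming the refs mapping comes from the existing \trefi harvesting (names are illustrative):

```python
# Compute the downward-closed set of glossary entries: start from the
# entries used in the document and follow \trefi targets transitively.
# refs maps each entry to the entries it \trefi-references.
def downward_closure(start: set[str], refs: dict[str, set[str]]) -> set[str]:
    seen, todo = set(), list(start)
    while todo:
        entry = todo.pop()
        if entry in seen:
            continue
        seen.add(entry)
        todo.extend(refs.get(entry, ()))
    return seen
```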
lmhtools2/concept_graph.py has regressed
It does not seem to work any more on LBS; to reproduce, change GRAPH= to GRAPH=graph.json in the Makefile and run make.
rename the glossary presentation macros.
I have moved the glossary presentation macros into smglom.sty; this makes the dependency structure simpler. I would like to rename them:
- smentry instead of entry
- \smsynonymref instead of \synonymref
- \smjointdefref instead of \seeref
generated IWGS glossary contains duplicates
for instance, primitive.
avoid duplication in glossary
instead of printing the definition body a second time, we should just do the equivalent of
\item[\hypertarget{bar}{bar}] sdfsdfsdfsdfsdfsdfsdfsdfsdfsdfdsf
\item[foo] see \hyperlink{bar}{bar}
or similar
file not found error
for some reason I get an error when generating the AI graph
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/en/ml-observations-outline.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/ml-forms.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/inductive-learning.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/ml-decision-trees.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/information-theory.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/evaluation.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/en/learning-scale.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/clt.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/linear-regression.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/neural-networks.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/support-vector-machines.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/en/ml-observations-summary.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/en/ml-xkcd.tex'
But the files are there, and LaTeX finds them. It seems that this happens when reading AI/source/ml/fragments/ml-observations.tex; maybe the reason is that this is the last file before the \covereduptohere[1]?
meta-inf/applications does not generate dictionary files
and I do not know why. *.{en,ro,de,zhs}.tex are missing.
BTW, I have updated the driver files significantly. We now have to run BibTeX on all of them.