lmhtools's People

Contributors: jfschaefer, kohlhase

lmhtools's Issues

dictionary does not find "Algorithmus"

notes.tex eventually includes /Users/kohlhase/localmh/MathHub/MiKoMH/GenICT/source/programming/en/program-execution.tex, which has

    \begin{definition}[display=flow,id=algorithm.def]
      \Defi{algorithm}: informal description of what to do (good enough for
      humans)
    \end{definition}

and indeed, there is an English entry for algorithm, but none for Algorithmus, even though smglom/IWGS/source/algorithm.de.tex exists and has

 Ein \defi[name=algorithm]{Algorithmus} ist eine formale oder informelle Spezifikation

glossary generation and nested modules

in MiKoMH/GenCS/source/adt/en/adt-def.tex we have the nested module

\item
    \begin{module}[id=adt-Bool]
      \symdef{BoolSort}{{{\mathbb{B}}}}
      \symdef{BoolTrueConst}{T}
      \symdef{BoolFalseConst}{F}
      \symdef{BoolADT}{\cB}
    \begin{example}[id=bool-adt,for=abstract-data-type]
      \inlinedef[for={Boolsort,BoolTrueConst,BoolFalseConst}]
      {$\defeq\BoolADT{\adt{\set{\BoolSort}}{\set{\consdecl\BoolTrueConst\BoolSort,
            \consdecl\BoolFalseConst\BoolSort}}}$} is an abstract data for
      \defiis{truth}{value}.
    \end{example}
  \end{module}

which generates

\begin{smentry}{\hypertarget{x43d933a4aadfaf62}{truth value}}{MiKoMH/GenCS}
\usemhmodule[mhrepos=MiKoMH/GenCS,path=adt/en/adt-def]{adt-def}
\inlinedef[for={Boolsort,BoolTrueConst,BoolFalseConst}]
      {$\defeq\BoolADT{\adt{\set{\BoolSort}}{\set{\consdecl\BoolTrueConst\BoolSort,
            \consdecl\BoolFalseConst\BoolSort}}}$} is an abstract data for
      \defiis{truth}{value}.
\end{smentry}

Note that the generated \usemhmodule points to adt-def as the used module, but it should be

\usemhmodule[mhrepos=MiKoMH/GenCS,path=adt/en/adt-def]{adt-Bool}

more elegant table

I think instead of the current

\begin{longtable}{p{0.495\textwidth}p{0.495\textwidth}}
\textbf{German}&\textbf{English}\\
\hline
\selectlanguage{german}Ausgabesystem & \selectlanguage{english}output subsystem\\

We could use the following (>{...}, provided by the array package, inserts the given code into every cell of the following column):

\begin{longtable}{>{\selectlanguage{german}}p{0.495\textwidth}>{\selectlanguage{english}}p{0.495\textwidth}}
\textbf{Deutsch}&\textbf{English}\\\hline
Ausgabesystem & output subsystem\\

Not essential, but more elegant.
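
A self-contained sketch of the proposed version (assuming babel with german and english and the array package, which provides the >{...} column prefix):

\documentclass{article}
\usepackage[german,english]{babel} % english loaded last = main document language
\usepackage{array}     % provides the >{...} column prefix
\usepackage{longtable}
\begin{document}
% >{...} runs the given declarations at the start of every cell in its column,
% so each column gets its own hyphenation patterns automatically
\begin{longtable}{>{\selectlanguage{german}}p{0.495\textwidth}>{\selectlanguage{english}}p{0.495\textwidth}}
\textbf{Deutsch}&\textbf{English}\\\hline
Ausgabesystem & output subsystem\\
\end{longtable}
\end{document}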

new glossary no longer compiles

because \lstinputmhlisting no longer works and I bet \mhgraphics does not either.
The problem is that the new macros no longer set the current module. I remember that this worked at some time, but was discarded. We probably need another argument to the glossary entry environment.

duplicates in glossary generation.

I get

\smjointdefref{plan}{x4fa0d953ba1a38a0}{state space}
\smjointdefref{plan}{x4fa0d953ba1a38a0}{state space}

in the glossary for AI. They even have the same hyperref target.

lemmata with formulae need \usemhmodule (sometimes)

I am starting to generate a glossary for AI. And I get

\begin{smentry}{\hypertarget{x53168c7dca632656}{\guse[MiKoMH/AI]{a-star}$\protect\astarSearch$ search}}{MiKoMH/AI}
\usemhmodule[repos=MiKoMH/AI,path=search/en/a-star]{a-star}

in glossary.en.tex. This has an error: In line 1, \guse[MiKoMH/AI]{a-star} should be
\usemhmodule[repos=MiKoMH/AI,path=search/en/a-star]{a-star}, i.e. just the same as line 3.
That should always work.

hyperreferences between dictionary and glossary

It would be good if we could include \item[\hypertarget{#1}{#1}] into the glossary entry. Then we could generate something like \hyperref{AI}{dictionary.pdf#AI} to have a cross-reference.
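
A hedged sketch of such macros (assuming hyperref; the macro names \glossentrytarget and \dictref are illustrative, not existing lmhtools/sTeX macros):

% turn each entry lemma into a named PDF destination (glossary and dictionary alike)
\newcommand{\glossentrytarget}[1]{\item[\hypertarget{#1}{#1}]}
% cross-document link, e.g. from the glossary into dictionary.pdf;
% \dictref{AI} links to the named destination AI in dictionary.pdf
\newcommand{\dictref}[1]{\href{dictionary.pdf\##1}{#1}}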

glossary drill cards

With the glossary, we can now make drill cards (essentially glossary entries where the lemma is on one side of a card and the definition on the other). These could be printed for drilling.
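
A minimal sketch of what such drill cards could look like in plain LaTeX (one card side per page; the class, geometry settings, and macro name are assumptions, and the \drillcard calls would be generated from the glossary data):

\documentclass{article}
\usepackage[a6paper,landscape,margin=5mm]{geometry}
\pagestyle{empty}
% front side: the lemma, back side: the definition, each on its own page
\newcommand{\drillcard}[2]{%
  \vspace*{\fill}\begin{center}\Huge #1\end{center}\vspace*{\fill}\newpage
  \vspace*{\fill}#2\par\vspace*{\fill}\newpage}
\begin{document}
\drillcard{algorithm}{informal description of what to do (good enough for humans)}
\end{document}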

generate a LaTeX Glossary and Dictionary

@jukkakohonen suggested the following (my paraphrase):
The lmhtools can already parse a lot of information out of smglom. In particular,

  • glossary information, i.e. which words are defined in which definitions in a given language.
  • dictionary information, i.e. which words are translations of each other (cross-language) and which words are synonyms of each other (same-language).

We should use this to generate (as a big sTeX/LaTeX file each)

  • for each language a math glossary, i.e. an alphabetic list of lemma/definition pairs, e.g. as in https://en.wikipedia.org/wiki/Glossary_of_computer_science
  • for each set of languages a math dictionary, i.e. an alphabetic table of translations (synset:l_1 & synset:l_2 & ...), where synset:l_i is the set of synonyms in language l_i

These should only need a Python script that uses the tools we already have. And having these would be very motivating for outsiders to contribute.

glossary generation over-generates

I generate the AI glossary MathHub/MiKoMH/AI/source/course/notes/en.glossary.tex with make -B gloss and I find the following spurious entry in it:

\begin{smentry}{\hypertarget{x1d02e3243c79d38c}{list constructor}}{MiKoMH/GenICT}
\usemhmodule[mhrepos=MiKoMH/GenICT,path=python/en/lists]{python-lists}
We call \lstinline[mathescape]|[$\pmetavar{seq}$]| the \defii{list}{constructor}.
\end{smentry}

I am not sure how this can get in, especially since there is no trace of loading
MiKoMH/GenICT/source/python/en/lists.tex in notes.log.

standalone glossary

The glossary should have a standalone prefix and postfix, i.e. it should be of the form

\documentclass[mh,notes]{mikoslides}
\libinput{preamble}
\begin{document}

and then the regular stuff and then

\end{document}

%%% Local Variables:
%%% mode: latex
%%% TeX-master: t
%%% End:

That would make it much simpler to debug.

import-graph output to Graphviz

I would like to have a way of outputting the course graph in Graphviz form, ideally with the colors intact. I.e. the output format (.dot for Graphviz and .json for TGView3D) should be selectable by an option or so.

I would also like to have a variant of the graph for a chapter in a course, i.e. where only the contribution of that chapter is highlighted. Having that in PDF (i.e. via Graphviz) would allow me to put it at the beginning of each chapter.

We probably have to discuss what I really want here.

Oh, and we need #23 fixed before this makes any sense.

give lecture-glossary.py an option for "indexing definitions that are already in the document"

I just discovered that the --big option is only half of what I want: making documents referentially complete by adding the glossary in an appendix. In particular, I do not want the definitions in the document to be duplicated.

So I need an option that treats definienda from inside the document in an index-like fashion: if we have a \defi{foo} in a module bar, then I want an index-like entry like the following to be generated:

\smindex{foo}{bar}

which will expand to something like

\item[foo] see the \href{definition}{foo@bar} on page ??

That will give all the \trefi in the document a unique place to point to.
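
A minimal sketch of how \smindex could expand (assuming hyperref and that each definition carries a \label of the form def:<module>:<lemma>; that labeling scheme is an assumption, not something the tools currently emit):

% to be used inside the glossary's description list
\newcommand{\smindex}[2]{% #1 = lemma, #2 = module
  \item[#1] see the \hyperref[def:#2:#1]{definition} in module #2
  on page~\pageref{def:#2:#1}}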

references from dictionary and glossary into notes.pdf

It would be nice to have references from the glossary and the dictionary back to notes.pdf.

I initially thought that we could use the entries for all symbols foo?bar in notes.idx, e.g.

\indexentry{semantics}{33}
\indexentry{composition!principle}{33}

but this does not work, since we do not have the module names.
I guess I have to generate another file where I have something like

\symbolindex{foo}{bar}{33}

probably in a separate file notes.sidx. Once we have that, it should be relatively easy to add this information to the glossary and the dictionary.
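
A sketch of how notes.sidx could be written during the LaTeX run; only the \symbolindex record format is from the proposal above, the helper macro \recordsymbol and its hook point are assumptions:

\newwrite\sidxfile
\AtBeginDocument{\immediate\openout\sidxfile=\jobname.sidx\relax}
% \recordsymbol{<symbol>}{<module>} appends \symbolindex{<symbol>}{<module>}{<page>}
% to \jobname.sidx; the deferred \write expands \thepage at shipout time,
% so the recorded page number is the one the symbol actually lands on
\newcommand{\recordsymbol}[2]{%
  \write\sidxfile{\string\symbolindex{#1}{#2}{\thepage}}}

A definition macro would then call something like \recordsymbol{semantics}{semantics-module} (module name hypothetical) at the point of definition.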

\usemhmodule and \guse should generate edge type "uses"

There is currently an "include cycle" in the IWGS graph between editor.tex and wordprocessors.tex, but the edge between wordprocessors.tex and editor.tex should be a "uses" link. I suspect this is a general problem.

I hope that fixing this will reduce cycles and give a much better layout.

dictionary and glossary have custom driver files

I would like to restructure the dictionary and glossary generation.
Instead of generating a full LaTeX file glossary.tex, we should only generate entry lists like glossary-en.tex to be included in a "glossary driver file". Then I can write glossary driver files of the form

\documentclass{article}
\title...
\begin{document}
\maketitle
\begin{abstract}....\end{abstract}
\section{Introduction}
blabla
\section{The Glossary in English}
\input{glossary-en}
\end{document}

That way the generated file does not have to commit to the macros it currently includes, and these environments can just be adapted in the driver file.
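
The generated glossary-en.tex would then contain nothing but the entry list, e.g. (shape only, reusing the smentry from above):

\begin{smentry}{\hypertarget{x43d933a4aadfaf62}{truth value}}{MiKoMH/GenCS}
\usemhmodule[mhrepos=MiKoMH/GenCS,path=adt/en/adt-def]{adt-Bool}
...
\end{smentry}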

new form of import/usemhmodule

I have introduced \importmhmodule[dir=foo]{bar} as an abbreviation for \importmhmodule[path=foo/bar]{bar}, and the same for \usemhmodule.
And generally, repos= is now mhrepos=.

I have changed the latter in make_glossary.py, but I suspect I have not found everything.

NOTE: the old form is also still valid.

IWGS glossary and dictionary.

I would like to have a glossary and dictionary like in #6 for the IWGS course, which I am currently multilingualizing.

Resources:

There are two ways glossary information enters the course.

lecture-glossary.py has regressed

It seems that it has not followed the recent change to \trefi[kb-query?fail]{failure} (e.g. \mtrefi does not exist any more).
It would be great to have this back.

expected mtrefi should be amended for drefi

I am getting the error message

/Users/kohlhase/localmh/MathHub/MiKoMH/IWGS/source/digdocs/en/trees-cs.tex:11:27: Expected mtrefi for '\drefi[branch?branch]{branches}'

but the sequence \drefi[tree]{root}, \drefi[branch?branch]{branches}, \drefi[tree?leaf]{leaves} is completely legitimate.
The error message should not be generated in this case.

unbalanced content generated by lecture-glossary.py

In MathHub/MiKoMH/LBS/source/course/notes/Makefile I generate a file en.glossary.tex using lecture-glossary.py (make gloss). But this generates non-well-bracketed content (just run it), which breaks the automated make. This is very annoying and delays LBS course notes generation.
It would be great, if you could fix it.

glossary generation and drefi (broken and underspecified)

The glossary generation tools do not work as (I) expected. And I think that the behavior is still not sufficiently specified, so I will try to specify it here.

Generally in a situation like

\begin{module}[id=FOO]
\importmodule{foo}
\begin{definition}[id=def.FOOBAR]
A \drefi[foo?bar]{BAR} is a SDFSDFSDSDFSF...
\end{definition}
...
\end{module}

the \drefi[foo?bar]{BAR} references a symbol with name foo?bar (i.e. the symbol with name=bar from the imported module foo) and creates a new symbol FOO?BAR, which is a "re-definition" or "recap" of foo?bar (whatever that really means).

In particular, both of them are visible whenever module FOO is imported. For the glossary tools this means that both foo?bar and FOO?BAR should be found, if the course contains the snippet above.

For the following I will assume that theory foo is from the SMGloM and is of the form

\begin{module}[id=foo]
\begin{definition}[id=def.foobar]
A \defi{bar} is a sdfsdfsfsdfsdfsdfsdf...
\end{definition}
...
\end{module}

BUT it is not clear what should be in the generated glossary. There are a couple of possibilities:

  • local: only - BAR: A BAR is a SDFSDFSDSDFSF..., i.e. the lemma BAR and the content of the definition def.FOOBAR; after all that is all that is really in the course.
  • source: only - BAR: a bar is a sdfsdfsfsdfsdfsdfsdf..., i.e. the lemma BAR and the content of the definition def.foobar; after all, def.foobar is just an (often abbreviated in courses) recap of def.FOOBAR (and bar and BAR are usually identical).
  • both: - BAR (see "bar"): A BAR is a SDFSDFSDSDFSF... and - BAR: a bar is a sdfsdfsfsdfsdfsdfsdf...; that may lead to a lot of duplication.

And then there is the question where we want to specify the behavior:

  • in the glossary generation program? e.g. by a switch --drefi=local/source/both for the three possibilities above.
  • in the \drefi or the enclosing definition; after all, there are multiple paradigmatic uses of \drefi:
    1. the local definition is a literal copy of the source definition, then source and local are identical and both makes no sense. Here sTeX should probably have some way of inputting the respective definition from SMGloM.
    2. the local definition is an abbreviation or reformulation of the source definition (but the lemmata bar and BAR are identical), e.g. the local definition is in the form of an \inlinedef, then probably the source behavior above is called for.
    3. the local definition establishes a synonym BAR for bar, then the both behavior is appropriate.

It seems the latter gives more control: we could use a variant \drefi* for cases 1 and 2 (their effect on the glossary is the same) and plain \drefi for case 3. The program option --drefi=? might then set a useful default behavior if no behavior is specified on the local drefis.
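
A minimal sketch of how the starred variant could be dispatched on the macro level (current LaTeX kernel or xparse; \drefivariant is an illustrative stand-in, not the real sTeX \drefi, and the \typeout lines only mark where the glossary-relevant choice would be recorded):

\NewDocumentCommand{\drefivariant}{s O{} m}{%
  \IfBooleanTF{#1}
    {\typeout{GLOSSARY: recap of #2, keep only the source definition}}%
    {\typeout{GLOSSARY: #3 is a new synonym for #2, keep both entries}}%
  #3}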

rename the glossary presentation macros.

I have moved the glossary presentation macros into smglom.sty; this makes the dependency structure simpler. I would like to rename them:

  • smentry instead of entry
  • \smsynonymref instead of \synonymref
  • \smjointdefref instead of \seeref

avoid duplication in glossary

instead of printing the definition body a second time, we should just do the equivalent of

\item[bar]\hypertarget{bar}{} sdfsdfsdfsdfsdfsdfsdfsdfsdfsdfdsf
\item[foo] see \href{#bar}{bar}

or similar
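
A self-contained version of that sketch (assuming hyperref, using \hypertarget/\hyperlink; the gloss: prefix for the target names is an assumption):

\documentclass{article}
\usepackage{hyperref}
\begin{document}
\begin{description}
\item[bar]\hypertarget{gloss:bar}{} sdfsdfsdfsdfsdfsdfsdfsdfsdfsdfdsf
\item[foo] see \hyperlink{gloss:bar}{bar}
\end{description}
\end{document}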

file not found error

for some reason I get an error when generating the AI graph


couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/en/ml-observations-outline.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/ml-forms.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/inductive-learning.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/ml-decision-trees.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/information-theory.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/evaluation.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/en/learning-scale.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/clt.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/linear-regression.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/neural-networks.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/fragments/support-vector-machines.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/en/ml-observations-summary.tex'
couldn't find '/Users/kohlhase/localmh/MathHub/MiKoMH/AI/source/ml/fragments/ml/en/ml-xkcd.tex'

But the files are there, and LaTeX finds them. It seems that this happens when reading AI/source/ml/fragments/ml-observations.tex; maybe the reason is that this is the last file before the \covereduptohere[1]?
