linuxfabrik / lfops
LFOps is an Ansible Collection of generic Roles, Playbooks and Plugins for managing Linux-based Cloud Infrastructures.
Home Page: https://linuxfabrik.ch
License: The Unlicense
New Hetzner features:
CHOOSE FROM 3 NETWORKING OPTIONS
Server with two public IP addresses (IPv4 and IPv6). Server with one public IP address (IPv4 or IPv6). Server without any public IP addresses.
SERVERS WITHOUT A PUBLIC NETWORK
Adding the server to a private network will enable you to create the server without any public IPs and thus without a connection to a public network.
NEW FLEXIBILITY WITH PRIMARY IPS
After a server has been created, you can still add, remove, or swap the server’s Primary IPs. To keep your Primary IP even if you delete the server it is assigned to, simply disable the “Auto Delete” option.
Currently, the postmap command and the lineinfile task are not idempotent.
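One way to make the postmap step idempotent is to run it only when the compiled map is missing or older than its source. A minimal sketch in plain Python (not the role's actual code), assuming the conventional .db file that postmap produces next to the source map:

```python
import os


def postmap_needed(map_path, db_suffix='.db'):
    """Return True only if the compiled Postfix map is missing or older
    than the source file - the only case in which running `postmap`
    would actually change anything."""
    db_path = map_path + db_suffix
    if not os.path.exists(db_path):
        return True
    return os.path.getmtime(map_path) > os.path.getmtime(db_path)
```

In the Ansible task, the same mtime comparison could drive a changed_when/when guard so the postmap command only reports changed when it really rebuilds the map.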
Happened on h164 (Rocky 8, not CentOS 7):
TASK [linuxfabrik.lfops.repo_epel : copy /tmp/ansible.RPM-GPG-KEY-EPEL-7 to /etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7] changed: [h164...]
After that, a file listing on h164 shows:
ll /etc/pki/rpm-gpg/
total 16
-rw-r--r--. 1 root root 1627 2. Dez 14:29 RPM-GPG-KEY-EPEL-8
-rw-r-----. 1 root root 1723 2. Dez 14:40 RPM-GPG-KEY-ICINGA
-rw-r--r--. 1 root root 1672 22. Dez 03:25 RPM-GPG-KEY-rockyofficial
-rw-r--r--. 1 root root 1672 22. Dez 03:25 RPM-GPG-KEY-rockytesting
Since the git clone task is delegated to localhost, it only runs once. This means that if a single host has a different version set in its host_vars, all the other hosts will get that version, too.
I set up a Rocky 8 Minimal (on VMware) using the graphical Anaconda Installer, with a Swiss German keyboard and the English language setting. After running basic_setup, I get Failed to set locale, defaulting to C.UTF-8 when running dnf list | grep php (for example).
$ localectl
System Locale: LANG=en_US.UTF-8
VC Keymap: ch
X11 Layout: ch
On a machine without that error:
$ localectl
System Locale: LANG=en_US.UTF-8
VC Keymap: us
X11 Layout: us
Still happens after localectl set-keymap us, localectl set-x11-keymap us, and an SSH logout and login.
How about this scenario?
Is this possible (including up- and downscaling of the VM without downtime)? It could save costs and improve response times.
Similar to how the infomaniak_vm role does it.
user               | pw | uid | gid | comment                       | home_dir     | user_shell
-------------------+----+-----+-----+-------------------------------+--------------+---------------
cockpit-ws         | x  | 995 | 993 | User for cockpit web service  | /nonexisting | /sbin/nologin
cockpit-wsinstance | x  | 994 | 992 | User for cockpit-ws instances | /nonexisting | /sbin/nologin
Exit with the highest error code (for monitoring of the systemd service), but do not abort early, since the folders are processed in sequence.
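The requested behavior could look like this - a sketch in plain Python, not the actual service script: run every folder's backup command in sequence, never abort on a failure, and exit with the worst return code at the end so systemd and the monitoring still see it.

```python
import subprocess


def run_all(commands):
    """Run every command in sequence - never abort on a failure - and
    return the highest exit code seen, so a monitored systemd service
    ends up reporting the worst result."""
    highest = 0
    for cmd in commands:
        result = subprocess.run(cmd)
        highest = max(highest, result.returncode)
    return highest
```

The caller would build one backup command per folder and pass the whole list, then sys.exit(run_all(commands)).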
Won't the included tasks always be executed? Or do they need tags as well?
TASK [linuxfabrik.lfops.php : flush handlers so that the mariadb can be used by other roles later] *****************************************************************************************************
Tuesday 14 June 2022 17:41:19 +0200 (0:00:00.396) 0:00:20.368 **********
[WARNING]: flush_handlers task does not support when conditional
Today, the login role distributes the SSH keys it knows from the host/group variables and deletes any other keys that already exist on the target system. On systems that were set up before applying LFOps, this behavior causes various problems.
A switch login__aggressive_key_management: true/false might help:
login__aggressive_key_management: true (default): behavior as today - delete all keys in .authorized_keys and distribute only the defined ones.
login__aggressive_key_management: false: handle only the keys defined in the host/group variables (these then need a present or absent state) and leave all other keys untouched.
Currently, we are partly relying on the OS-specific tasks. However, this only works if we can include the tasks at the start of the role, and that is not possible for the monitoring_plugins role.
We are also adding when statements based on the OS in the playbooks - is that enough?
Otherwise Python 3.6 is installed, which might be too old nowadays, and you run into problems like the one described in Linuxfabrik/monitoring-plugins#587.
Python 3.9 is preferred. If multiple Python versions are installed in parallel, set python3 to python3.8 or python3.9.
in plugins/module_utils/bitwarden.py.
This causes some side effects (duplicity, glances and update-and-reboot of course no longer work).
Currently we are not sure what the cause is, but on one machine /usr/local/bin went missing from the PATH.
Although the admin is no longer sure whether this was all he did (and in what order), we saw the same effect (missing PATH) on another machine a few days ago, too.
Could be Rocky 8.5, Rocky 8.6, Minimal vs. DVD image - or LFOps.
Contains the README of the Ansible role python_venv.
@NavidSassan Could this be a copy/paste issue?
pip, wheel, and the "main" package of that venv.
There should be one script and config for all the venvs:
Is it necessary to have our own Postfix role?
Similarly to our old borgbackups?
To be aware of what we are working on - this is very important.
Instead of today's
[15:26:54 root@fw01 ~]$
it should be:
[15:26:54 root@fw01 fedora35 ~]$
[15:26:54 root@fw01 rhel8 ~]$
[15:26:54 root@fw01 ubuntu18 ~]$
[15:26:54 root@fw01 debian11 ~]$
Is it necessary to have our own sshd role?
Currently, if an attachment is specified in the task and one with the same basename is already uploaded, we assume it is the same file and do not change anything.
This can lead to outdated files when a server is re-installed, for example. Should we always re-upload attachments? That would cause the task to always report changed if any attachment is specified. Or should we download the existing file from Bitwarden and diff the two (md5sum)?
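The md5sum variant could work like this - a sketch of only the diff decision; actually downloading the attachment from Bitwarden is omitted:

```python
import hashlib


def md5sum(path, chunk_size=65536):
    """Stream a file through md5 so large attachments need not fit in memory."""
    digest = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()


def needs_reupload(local_path, downloaded_path):
    """True when the local attachment differs from the copy fetched from
    Bitwarden - only then does the task have to change anything."""
    return md5sum(local_path) != md5sum(downloaded_path)
```

This keeps the task idempotent (changed only on a real difference) at the cost of one download per run.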
TASK [linuxfabrik.lfops.sshd : semanage port --add --type ssh_port_t --proto tcp 22] **************************************************************************************
Tuesday 31 May 2022 16:41:46 +0200 (0:00:01.007) 0:01:46.800 ***********
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: ModuleNotFoundError: No module named 'seobject'
fatal: [hostname]: FAILED! => changed=false
msg: Failed to import the required Python library (policycoreutils-python) on hostname's Python /usr/libexec/platform-python. Please read the module documentation and install it in the appropriate location. If the required library is installed, but Ansible is using the wrong Python interpreter, please consult the documentation on ansible_python_interpreter
Causes the following error against Windows hosts:
msg: The powershell shell family is incompatible with the sudo become plugin
Is it necessary to have our own kdump role?
Test the role on the listed OSs, or adjust the READMEs.
Otherwise we get the following in FirewallBuilder: /etc/fwb.sh: line 482: nft: command not found
Imagine a folder nextcloud/data that is backed up in parallel (divide: True). If at some point a subfolder is deleted, duplicity has no way to apply its retention mechanism to it, so duba has to do this directly via the Swift interface (using the same retention times).
Assume you have a duba.json with just the defaults, no further host-based duplicity__ config settings. The generated config then looks like this:
...
"backup_sources": [
{
"divide": false,
"path": "/backup"
},
{
"divide": false,
"path": "/etc"
},
{
"divide": false,
"path": "/home"
},
{
"divide": false,
"path": "/opt"
},
{
"divide": false,
"path": "/root"
},
{
"divide": false,
"path": "/var/spool/cron"
},
...
Configure a host-specific setting:
duplicity__host_backup_sources:
- path: '/opt'
divide: true
A wrong config file is generated (/opt is configured twice with contrary values):
"backup_sources": [
{
"divide": false,
"path": "/backup"
},
{
"divide": false,
"path": "/etc"
},
{
"divide": false,
"path": "/home"
},
{
"divide": false,
"path": "/opt"
},
{
"divide": false,
"path": "/root"
},
{
"divide": false,
"path": "/var/spool/cron"
},
{
"divide": true,
"path": "/opt"
}
...
This happens even if you change duba.json on the server and delete
{
"divide": false,
"path": "/opt"
},
beforehand.
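The merge probably has to be keyed on path instead of simply appending the host-specific list. A minimal sketch in plain Python (the role's actual merge presumably happens in Jinja2 or a filter plugin):

```python
def merge_backup_sources(defaults, host_specific):
    """Merge host-specific backup sources over the defaults, keyed on
    'path', so that e.g. {'path': '/opt', 'divide': True} replaces the
    default /opt entry instead of being appended next to it."""
    merged = {source['path']: source for source in defaults}
    for source in host_specific:
        merged[source['path']] = source
    return sorted(merged.values(), key=lambda source: source['path'])
```

With this, /opt appears exactly once in backup_sources, carrying the host-specific divide: true.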
Ignore these files and directories:
.cache
/boot/grub2/grubenv
/etc/pihole/pihole-FTL.db-journal
/root/.cache/duplicity
/root/.gnupg
/root/.gnupg/S.gpg-agent
/root/.gnupg/S.gpg-agent.browser
/root/.gnupg/S.gpg-agent.extra
/root/.gnupg/S.gpg-agent.ssh
/var/log/boot.log
/var/log/dmesg
/var/log/dmesg.old
/var/log/fail2ban.log
/var/log/hawkey.log
/var/log/lighttpd/error.log
/var/log/maillog
/var/log/php-fpm
/var/log/sssd/sssd.log
/var/log/sssd/sssd_implicit_files.log
/var/log/sssd/sssd_kcm.log
/var/log/sssd/sssd_nss.log
/var/log/sssd/sssd_pac.log
/var/log/sssd/sssd_pam.log
/var/log/sssd/sssd_ssh.log
/var/log/sssd/sssd_sudo.log
/var/run/utmp
Enhance so that notification plugins are also deployed, but on Icinga masters only.
Currently, the role only adds new rules, but does not remove extra ones. This makes it hard to add a temporary rule, for example.
in plugins/modules/gpg_key.py
$ /home/ansible/.local/bin/ansible-galaxy collection install git+https://github.com/Linuxfabrik/lfops.git -vvv
[DEPRECATION WARNING]: Ansible will require Python 3.8 or newer on the controller starting with Ansible 2.12. Current
version: 3.6.8 (default, Nov 16 2020, 16:55:22) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]. This feature will be removed
from ansible-core in version 2.12. Deprecation warnings can be disabled by setting deprecation_warnings=False in
ansible.cfg.
ansible-galaxy [core 2.11.9]
config file = /home/ansible/ansible/ansible.cfg
configured module search path = ['/home/ansible/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/ansible/.local/lib/python3.6/site-packages/ansible
ansible collection location = /home/ansible/.ansible/collections:/usr/share/ansible/collections
executable location = /home/ansible/.local/bin/ansible-galaxy
python version = 3.6.8 (default, Nov 16 2020, 16:55:22) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
jinja version = 3.0.3
libyaml = True
Using /home/ansible/ansible/ansible.cfg as config file
Cloning into '/home/ansible/.ansible/tmp/ansible-local-19314c4wtoxpj/tmpfh46yqw1/lfops18sjzssa'...
remote: Enumerating objects: 4208, done.
remote: Counting objects: 100% (1039/1039), done.
remote: Compressing objects: 100% (465/465), done.
remote: Total 4208 (delta 555), reused 878 (delta 469), pack-reused 3169
Receiving objects: 100% (4208/4208), 725.49 KiB | 3.15 MiB/s, done.
Resolving deltas: 100% (1908/1908), done.
Your branch is up to date with 'origin/main'.
Starting galaxy collection install process
Found installed collection linuxfabrik.lfops:1.0.1 at '/home/ansible/.ansible/collections/ansible_collections/linuxfabrik/lfops'
Process install dependency map
Starting collection install process
Installing 'linuxfabrik.lfops:1.0.1' to '/home/ansible/.ansible/collections/ansible_collections/linuxfabrik/lfops'
Skipping '/home/ansible/.ansible/tmp/ansible-local-19314c4wtoxpj/tmpfh46yqw1/lfops18sjzssa/.git' for collection build
Skipping '/home/ansible/.ansible/tmp/ansible-local-19314c4wtoxpj/tmpfh46yqw1/lfops18sjzssa/.github' for collection build
Skipping '/home/ansible/.ansible/tmp/ansible-local-19314c4wtoxpj/tmpfh46yqw1/lfops18sjzssa/.gitignore' for collection build
Skipping '/home/ansible/.ansible/tmp/ansible-local-19314c4wtoxpj/tmpfh46yqw1/lfops18sjzssa/galaxy.yml' for collection build
ERROR! Unexpected Exception, this is probably a bug: [Errno 2] No such file or directory: b'/home/ansible/.ansible/tmp/ansible-local-19314c4wtoxpj/tmpfh46yqw1/lfops18sjzssa/plugins/modules/lib'
the full traceback was:
Traceback (most recent call last):
File "/home/ansible/.local/bin/ansible-galaxy", line 135, in <module>
exit_code = cli.run()
File "/home/ansible/.local/lib/python3.6/site-packages/ansible/cli/galaxy.py", line 552, in run
return context.CLIARGS['func']()
File "/home/ansible/.local/lib/python3.6/site-packages/ansible/cli/galaxy.py", line 75, in method_wrapper
return wrapped_method(*args, **kwargs)
File "/home/ansible/.local/lib/python3.6/site-packages/ansible/cli/galaxy.py", line 1188, in execute_install
artifacts_manager=artifacts_manager,
File "/home/ansible/.local/lib/python3.6/site-packages/ansible/cli/galaxy.py", line 1217, in _execute_install_collection
artifacts_manager=artifacts_manager,
File "/home/ansible/.local/lib/python3.6/site-packages/ansible/galaxy/collection/__init__.py", line 539, in install_collections
install(concrete_coll_pin, output_path, artifacts_manager)
File "/home/ansible/.local/lib/python3.6/site-packages/ansible/galaxy/collection/__init__.py", line 1079, in install
install_src(collection, b_artifact_path, b_collection_path, artifacts_manager)
File "/home/ansible/.local/lib/python3.6/site-packages/ansible/galaxy/collection/__init__.py", line 1162, in install_src
collection_manifest, file_manifest,
File "/home/ansible/.local/lib/python3.6/site-packages/ansible/galaxy/collection/__init__.py", line 1005, in _build_collection_dir
existing_is_exec = os.stat(src_file).st_mode & stat.S_IXUSR
FileNotFoundError: [Errno 2] No such file or directory: b'/home/ansible/.ansible/tmp/ansible-local-19314c4wtoxpj/tmpfh46yqw1/lfops18sjzssa/plugins/modules/lib'