[-] tofubl@discuss.tchncs.de 2 points 7 months ago

The services run on a separate box; I haven't decided yet which VLAN to put it on. I wasn't planning to have it in the DMZ, but rather to create ingress firewall rules from the DMZ.

[-] tofubl@discuss.tchncs.de 2 points 9 months ago

You can easily host the community edition in Docker or otherwise. Odoo has a steep learning curve but it's very versatile. It can definitely do what you describe.
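For anyone curious, a minimal compose sketch of what that can look like with the official images (the tags, credentials, and volume names here are placeholders, so check the image docs for current values):

```yaml
# Minimal Odoo CE + Postgres sketch; image tags and passwords are placeholders.
services:
  db:
    image: postgres:15
    environment:
      POSTGRES_DB: postgres       # the odoo image expects the default db to exist
      POSTGRES_USER: odoo
      POSTGRES_PASSWORD: changeme
    volumes:
      - odoo-db:/var/lib/postgresql/data
  odoo:
    image: odoo:17
    depends_on:
      - db
    ports:
      - "8069:8069"               # web UI
    environment:
      HOST: db                    # env vars the official odoo image reads
      USER: odoo
      PASSWORD: changeme
    volumes:
      - odoo-data:/var/lib/odoo
volumes:
  odoo-db:
  odoo-data:
```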

[-] tofubl@discuss.tchncs.de 2 points 9 months ago

OSMC on an RPi3 with a HiFiBerry+ has served me well for many years. Most things just work, even TV remote passthrough over HDMI-CEC if the TV supports it (the brand name for the implementation varies by TV manufacturer, I think). My setup has been really slow in recent months, but I probably just need a new SD card... Streaming service integration in Kodi isn't perfect, but e.g. Netflix works well enough.

It takes a bit of tinkering to get it just the way you want it, but not too much, and then it's great, with a lot of flexibility. I have slapped an IR LED onto a GPIO, for example, and I have a service running that checks for audio output and turns my old hifi system on and off accordingly.
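The watcher itself is tiny; a sketch of the idea (the ALSA status path and the "hifi"/"KEY_POWER" names are assumptions that will differ per setup):

```javascript
// Sketch: poll ALSA playback state and toggle the amp via lirc's irsend.
// The STATUS path and remote/key names are placeholders, not my exact setup.
const { execSync } = require("child_process");
const { readFileSync } = require("fs");

const STATUS = "/proc/asound/card0/pcm0p/sub0/status";
let ampOn = false;

setInterval(() => {
  let playing = false;
  try {
    // the file contains "state: RUNNING" while something is playing
    playing = readFileSync(STATUS, "utf8").includes("state: RUNNING");
  } catch (e) {
    // card index can shift; treat read errors as "not playing"
  }
  if (playing !== ampOn) {
    ampOn = playing;
    execSync("irsend SEND_ONCE hifi KEY_POWER"); // fire the IR LED on the GPIO
  }
}, 5000);
```

A real version probably wants a grace period before switching off, so short pauses between tracks don't power-cycle the amp.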

[-] tofubl@discuss.tchncs.de 2 points 9 months ago* (last edited 9 months ago)

i times i is -1, though. Imagine that!

[-] tofubl@discuss.tchncs.de 2 points 9 months ago

[three screenshots]

The docker01 alias is a host alias pointing at 10.0.0.22, and there's an Apache test container running on port 8888.

I have created a pass-any-in rule on WAN (just until I figure out what's wrong).

In Firewall > Settings > Advanced, I have enabled "Reflection for port forwards" and "Automatic outbound NAT for reflection", although I'm not sure if that is needed.
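For completeness, this is the kind of check I've been doing (the WAN address below is a placeholder, and I'm assuming the forward listens on 8888):

```
# direct to the container host -- bypasses the port forward entirely
curl -I http://10.0.0.22:8888/

# via the WAN address from a LAN client -- the NAT reflection path
curl -I http://203.0.113.10:8888/
```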

Is there any other info I can provide?

[-] tofubl@discuss.tchncs.de 2 points 10 months ago

Here's a cool article I found on Nextcloud performance improvements; connecting Redis over a Unix socket gave me a more substantial performance boost than migrating to Postgres did. Very happy I fell down this rabbit hole today.

Some notes if you're following the tutorial in the link above and are using the nextcloud:stable container together with the recommended cron container:

  • the redis configuration (host, port, password, ...) needs to be set in config/config.php as well as config/redis.config.php (see the sketch below)
  • the cron container needs to receive the same /etc/localtime and /etc/timezone volumes the app container did, as well as the volumes_from: tmp
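A minimal sketch of both, assuming a Redis socket at /run/redis/redis.sock and compose services named like in the tutorial (the socket path, password, and service names are placeholders):

```php
<?php
// config/redis.config.php -- mirror the same values in config/config.php
$CONFIG = [
  'memcache.distributed' => '\OC\Memcache\Redis',
  'memcache.locking' => '\OC\Memcache\Redis',
  'redis' => [
    'host' => '/run/redis/redis.sock', // Unix socket path instead of a hostname
    'port' => 0,                       // 0 means "socket, not TCP"
    'password' => 'changeme',
  ],
];
```

```yaml
# compose excerpt: the cron container mirrors the app container's mounts
cron:
  image: nextcloud:stable
  entrypoint: /cron.sh
  volumes:
    - /etc/localtime:/etc/localtime:ro
    - /etc/timezone:/etc/timezone:ro
  volumes_from:
    - tmp
```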
[-] tofubl@discuss.tchncs.de 2 points 10 months ago* (last edited 10 months ago)

Interesting. Do you remember where you read this?

The process seems simple enough. I'm on the nextcloud:stable docker image, so adding a postgres container is really easy, but it's a scary task...
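For reference, the conversion itself seems to boil down to one occ call; roughly this, with the service, user, and database names as placeholders (it prompts for the DB password):

```
# run inside the app container as the web user
docker compose exec -u www-data app \
  php occ db:convert-type --all-apps pgsql nextcloud db nextcloud
```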

[-] tofubl@discuss.tchncs.de 2 points 10 months ago* (last edited 10 months ago)

A very welcome compliment after this ordeal. Thank you! :)

The cleanest way to solve the scaling issue would probably be to go into the pdf2pic module and hack it open to accept the "pcl:fit-to-page" option that GraphicsMagick (the underlying software package doing the actual conversion from PDF to PCL) supports. (Supposing it actually does what it says. I'm not so sure about anything in printer land anymore.)
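A less invasive route would be to skip pdf2pic and call the gm module directly, since it lets you pass arbitrary options through. A rough, untested sketch (the option name is taken straight from the GraphicsMagick docs, so the same caveat applies):

```javascript
// Hypothetical: PDF buffer -> PCL buffer via gm, passing fit-to-page ourselves.
const gm = require("gm");

function pdfToPcl(pdfBuffer) {
  return new Promise((resolve, reject) => {
    gm(pdfBuffer, "input.pdf")
      .density(300, 300)
      .out("-define", "pcl:fit-to-page") // the option pdf2pic doesn't expose
      .toBuffer("PCL", (err, buf) => (err ? reject(err) : resolve(buf)));
  });
}
```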

But since this whole thing is for internal documents only, and the scaling can probably be approximated by choosing better values for width/height to account for printer margins, I most likely won't bother.

Thanks again for suggesting Node-RED. I'm very happy with the result.

[-] tofubl@discuss.tchncs.de 2 points 10 months ago* (last edited 10 months ago)

One deep dive into the IPP protocol, printer drivers, and specific supported formats ("what the hell is an octet-stream?!") later, I have something functional.

I'm not a JS guy, so I don't know if it's a very node-y way to do it. But except for the pages coming out a little scaled, this works:

Edit to briefly explain what this does: The mail node ingests unread emails and passes them on to a function node that checks for PDF attachments. Those are passed on one by one to a function node that converts the PDF to PCL (my printer supposedly knows how to handle PDF 1.7 but doesn't, so I had to resort to this) and passes each page to the IPP node as a data buffer.

[ { "id": "86082bed0ed29155", "type": "IPPrint", "z": "08cc7c15668f2f65", "name": "Print to Network", "IP": "10.10.0.19:631/ipp/print", "JOB_name": "default_job_name", "authuser": "", "authpassword": "", "authcheck": "", "x": 780, "y": 120, "wires": [ [ "b1a8b92ede8a78a5" ] ] }, { "id": "b1a8b92ede8a78a5", "type": "debug", "z": "08cc7c15668f2f65", "name": "debug 1", "active": true, "tosidebar": true, "console": false, "tostatus": false, "complete": "true", "targetType": "full", "statusVal": "", "statusType": "auto", "x": 960, "y": 120, "wires": [] }, { "id": "1fb7fc3843af8e05", "type": "function", "z": "08cc7c15668f2f65", "name": "Convert PDF to PCL", "func": "const baseOptions = {\n width: 2480,\n height: 3508,\n density: 300,\n preserveAspectRatio: true,\n format: 'PCL'\n};\n\n// prepare conversion function\nconst convert = pdf2pic.fromBuffer(msg.payload, baseOptions);\n// bulk convert the whole input buffer to a pcl output buffer\nconvert.bulk(-1, {responseType: \"buffer\"}).then((outputs)=>{\n // pass each page on to the printer as its own message\n outputs.forEach((output, index) => {\n const out = { ...msg };\n out.payload = output.buffer;\n out.JOBName = (msg.topic ? msg.topic : \"Mailjob\").concat(\" \", index+1, \"/\", outputs.length);\n out.docFormat = \"application/octet-stream\";\n node.send(out);\n });\n});\n", "outputs": 1, "timeout": 0, "noerr": 0, "initialize": "", "finalize": "", "libs": [ { "var": "pdf2pic", "module": "pdf2pic" } ], "x": 560, "y": 120, "wires": [ [ "86082bed0ed29155" ] ] }, { "id": "224317e794da3cb5", "type": "e-mail in", "z": "08cc7c15668f2f65", "name": "Mail ingest", "protocol": "IMAP", "server": "test.mail.com", "useSSL": false, "autotls": "always", "port": "143", "authtype": "BASIC", "saslformat": true, "token": "oauth2Response.access_token", "box": "INBOX", "disposition": "Read", "criteria": "UNSEEN", "repeat": "300", "fetch": "auto", "inputs": 0, "x": 160, "y": 80, "wires": [ [ "2fedaa18c9394b89" ] ] }, { "id": "2fedaa18c9394b89", "type": "function", "z": "08cc7c15668f2f65", "name": "PDF Filter", "func": "if(msg.attachments.length > 0){\n msg.attachments.forEach(function(attachment) {\n if(attachment.contentType == \"application/pdf\"){\n var newmsg = {};\n newmsg.topic = msg.topic;\n newmsg.filename = attachment.filename;\n newmsg.payload = attachment.content;\n node.send(newmsg);\n }\n });\n}", "outputs": 1, "timeout": 0, "noerr": 0, "initialize": "", "finalize": "", "libs": [], "x": 360, "y": 80, "wires": [ [ "1fb7fc3843af8e05" ] ] }, { "id": "3d8f7ad8e6b56d46", "type": "inject", "z": "08cc7c15668f2f65", "name": "Test print", "props": [ { "p": "payload" }, { "p": "topic", "vt": "str" } ], "repeat": "", "crontab": "", "once": false, "onceDelay": 0.1, "topic": "", "payload": "", "payloadType": "date", "x": 160, "y": 160, "wires": [ [ "683ed6fe7c10540d" ] ] }, { "id": "683ed6fe7c10540d", "type": "file in", "z": "08cc7c15668f2f65", "name": "", "filename": "/data/2517.pdf", "filenameType": "str", "format": "", "chunk": false, "sendError": false, "encoding": "none", "allProps": false, "x": 340, "y": 160, "wires": [ [ "1fb7fc3843af8e05" ] ] } ]

[-] tofubl@discuss.tchncs.de 2 points 11 months ago* (last edited 11 months ago)

Now this is an idea I like. Thank you so much, I never would have considered Node-RED for this application!

[-] tofubl@discuss.tchncs.de 2 points 1 year ago

Snapcast works incredibly well for multi-device audio.

Has anyone tried setting up multiple zones with it that can play different things at the same time? I imagine you would need one Snapcast server per zone? And is there an easy way to assign the clients to one of the servers?

[-] tofubl@discuss.tchncs.de 2 points 1 year ago

Yeah, I reached that point yesterday. I gave it one last shot with MS's own system image tool, with the same results. The upside: doing a system recovery with the "keep my data" option actually worked okay. User data, SSH keys and so on were still there, plus an HTML file on the desktop with a list of all the apps that were removed. Could have been worse... Thanks again for the help!
