[-] AK1174@alien.top 1 points 11 months ago

well, a web server is a pen-testable thing, and also a very commonly pen-tested thing, so the background knowledge is useful.

[-] AK1174@alien.top 1 points 11 months ago

all of the above + more???

[-] AK1174@alien.top 1 points 1 year ago

you could use LocalAI or Ollama, but neither is going to work with 300 MB of RAM, and both need a fair amount of compute resources for response speed to be usable. these models are also not very capable compared to OpenAI's GPTs, but that depends on what your goal is with the models.
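as a sketch of what "using Ollama" looks like in practice (the model name `llama3` is just an example; any model from the Ollama library works, and the memory/compute point above still applies):

```shell
# pull a model from the Ollama library (downloads several GB of weights)
ollama pull llama3

# run an interactive one-shot prompt from the CLI
ollama run llama3 "Summarize what a reverse proxy does."

# or hit the local HTTP API that the Ollama daemon exposes on port 11434
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Summarize what a reverse proxy does.", "stream": false}'
```

even a small model here wants a few GB of RAM resident, which is why 300 MB is a non-starter.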
