<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Hardened AI Workstation | Built with anycoder</title>
<style>
:root {
--primary: #1a1a2e;
--secondary: #16213e;
--accent: #0f3460;
--text: #e6e6e6;
--highlight: #00d4ff;
}
* {
margin: 0;
padding: 0;
box-sizing: border-box;
}
body {
font-family: 'Courier New', monospace;
background-color: var(--primary);
color: var(--text);
line-height: 1.6;
padding: 2rem;
}
.container {
max-width: 1200px;
margin: 0 auto;
}
header {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 2rem;
padding-bottom: 1rem;
border-bottom: 1px solid var(--accent);
}
.logo {
font-size: 1.5rem;
font-weight: bold;
}
.anycoder-link {
color: var(--highlight);
text-decoration: none;
}
.anycoder-link:hover {
text-decoration: underline;
}
.card {
background-color: var(--secondary);
border-radius: 8px;
padding: 2rem;
margin-bottom: 2rem;
box-shadow: 0 4px 6px rgba(0, 0, 0, 0.1);
}
h1, h2, h3 {
color: var(--highlight);
margin-bottom: 1rem;
}
.code-block {
background-color: #0a0a1a;
border-radius: 4px;
padding: 1rem;
overflow-x: auto;
font-family: monospace;
font-size: 0.9rem;
margin: 1rem 0;
}
.security-badge {
display: inline-block;
background-color: var(--highlight);
color: var(--primary);
padding: 0.3rem 0.8rem;
border-radius: 4px;
font-size: 0.8rem;
font-weight: bold;
margin-left: 0.5rem;
}
.feature-list {
list-style-type: none;
margin: 1rem 0;
}
.feature-list li {
margin-bottom: 0.5rem;
padding-left: 1.5rem;
position: relative;
}
.feature-list li:before {
content: "βœ“";
color: var(--highlight);
position: absolute;
left: 0;
}
footer {
margin-top: 2rem;
padding-top: 1rem;
border-top: 1px solid var(--accent);
text-align: center;
font-size: 0.9rem;
}
@media (max-width: 768px) {
body {
padding: 1rem;
}
.card {
padding: 1rem;
}
}
</style>
</head>
<body>
<div class="container">
<header>
<div class="logo">
Hardened AI Workstation
<span class="security-badge">MAX SECURITY</span>
</div>
<a href="https://huggingface.co/spaces/akhaliq/anycoder" class="anycoder-link" target="_blank" rel="noopener noreferrer">
Built with anycoder
</a>
</header>
<main>
<div class="card">
<h1>Hardened Qwen 3 Local AI Solution</h1>
<p>This implementation provides a completely local, hardened AI environment with read-only access to the model files.</p>
</div>
<div class="card">
<h2>Docker Implementation</h2>
<p>Here's the complete Docker setup for running Qwen 3 in an Alpine container with the model mounted read-only:</p>
<div class="code-block">
# Dockerfile for Hardened Qwen 3<br>
# (base image pinned instead of :latest so builds are reproducible)<br>
FROM alpine:3.20<br><br>
# Install minimal system dependencies<br>
RUN apk add --no-cache python3 py3-pip<br><br>
# Mount point for the model; the actual directory is bind-mounted<br>
# read-only at run time (see deployment instructions)<br>
RUN mkdir -p /model<br>
VOLUME /model<br><br>
# Set working directory<br>
WORKDIR /app<br><br>
# Copy the dependency manifest first for better layer caching<br>
COPY requirements.txt .<br><br>
# Install pinned Python dependencies (torch, transformers, sentencepiece).<br>
# --break-system-packages is needed where the interpreter is marked<br>
# externally managed (PEP 668); note that prebuilt torch wheels target glibc,<br>
# so Alpine/musl may need a source build or a separately supplied wheel<br>
RUN pip3 install --no-cache-dir --break-system-packages -r requirements.txt<br><br>
# Copy application files<br>
COPY app.py .<br><br>
# Security hardening: lock down file permissions<br>
RUN chmod 500 /app && \<br>
chmod 400 /app/app.py /app/requirements.txt<br><br>
# Run as a non-root user with no login shell<br>
RUN adduser -D -s /sbin/nologin aiuser && \<br>
chown -R aiuser:aiuser /app<br><br>
USER aiuser<br><br>
# The root filesystem is made read-only at run time via docker run --read-only;<br>
# remounting from inside an unprivileged container is neither possible nor needed<br>
CMD ["python3", "/app/app.py"]
</div>
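<p>The Dockerfile above expects a requirements.txt next to app.py. A minimal sketch follows; the version numbers are illustrative placeholders and should be replaced with the exact versions you have tested against your Qwen 3 checkpoint:</p>
<div class="code-block">
# requirements.txt (sketch - pin to versions you have actually validated)<br>
torch==2.3.1<br>
transformers==4.41.2<br>
sentencepiece==0.2.0
</div>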
</div>
<div class="card">
<h2>Security Features</h2>
<ul class="feature-list">
<li>Alpine Linux base for minimal attack surface</li>
<li>Read-only root filesystem enforced at run time (docker run --read-only)</li>
<li>Non-root user execution</li>
<li>Minimal package installation</li>
<li>No internet access required</li>
<li>Model files mounted as read-only volume</li>
<li>Strict file permissions (400 for sensitive files)</li>
<li>No shell access in production</li>
<li>All dependencies pinned to specific versions</li>
<li>Automatic cleanup of cache files</li>
</ul>
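<p>Once the container from the deployment section below is running, several of these properties can be confirmed from the host with docker inspect (a sketch; the expected values assume the run command shown in the deployment instructions):</p>
<div class="code-block">
# Confirm the non-root user, disabled networking, and dropped capabilities<br>
docker inspect -f '{{.Config.User}} {{.HostConfig.NetworkMode}} {{.HostConfig.CapDrop}}' qwen3-ai<br>
# Expected output: aiuser none [ALL]
</div>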
</div>
<div class="card">
<h2>Python Application</h2>
<p>The main application file (app.py) for running the hardened Qwen 3 model:</p>
<div class="code-block">
import os<br>
import sys<br>
from transformers import AutoModelForCausalLM, AutoTokenizer<br><br>
# Security checks<br>
def security_checks():<br>
&nbsp;&nbsp;&nbsp;&nbsp;# Verify read-only filesystem<br>
&nbsp;&nbsp;&nbsp;&nbsp;if not os.access('/', os.W_OK):<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;print("✓ Filesystem is read-only")<br>
&nbsp;&nbsp;&nbsp;&nbsp;else:<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;print("✗ Filesystem is writable - security risk!")<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;sys.exit(1)<br><br>
&nbsp;&nbsp;&nbsp;&nbsp;# Verify model directory exists and is readable<br>
&nbsp;&nbsp;&nbsp;&nbsp;if os.path.exists('/model') and os.access('/model', os.R_OK):<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;print("✓ Model directory accessible")<br>
&nbsp;&nbsp;&nbsp;&nbsp;else:<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;print("✗ Model directory not accessible")<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;sys.exit(1)<br><br>
# Initialize model<br>
def init_model():<br>
&nbsp;&nbsp;&nbsp;&nbsp;try:<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;# Load model from the read-only location<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;model = AutoModelForCausalLM.from_pretrained(<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;'/model/qwen3',<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;trust_remote_code=False,<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;local_files_only=True<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;)<br><br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;tokenizer = AutoTokenizer.from_pretrained(<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;'/model/qwen3',<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;trust_remote_code=False,<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;local_files_only=True<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;)<br><br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;print("✓ Model loaded successfully")<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;return model, tokenizer<br>
&nbsp;&nbsp;&nbsp;&nbsp;except Exception as e:<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;print(f"✗ Model loading failed: {e}")<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;sys.exit(1)<br><br>
# Main execution<br>
if __name__ == "__main__":<br>
&nbsp;&nbsp;&nbsp;&nbsp;print("Starting Hardened Qwen 3 AI...")<br>
&nbsp;&nbsp;&nbsp;&nbsp;security_checks()<br>
&nbsp;&nbsp;&nbsp;&nbsp;model, tokenizer = init_model()<br><br>
&nbsp;&nbsp;&nbsp;&nbsp;# Your application logic here<br>
&nbsp;&nbsp;&nbsp;&nbsp;print("AI ready for local inference")
</div>
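<p>The placeholder at the end of app.py can be filled in with a small generation helper. The sketch below uses the standard transformers generate API on the CPU; the function name, prompt, and token limit are illustrative choices rather than part of the original design:</p>
<div class="code-block">
# Example inference helper (sketch) to append to app.py<br>
def generate_reply(model, tokenizer, prompt, max_new_tokens=128):<br>
&nbsp;&nbsp;&nbsp;&nbsp;# Tokenize the prompt and generate a completion locally<br>
&nbsp;&nbsp;&nbsp;&nbsp;inputs = tokenizer(prompt, return_tensors="pt")<br>
&nbsp;&nbsp;&nbsp;&nbsp;outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)<br>
&nbsp;&nbsp;&nbsp;&nbsp;return tokenizer.decode(outputs[0], skip_special_tokens=True)<br><br>
# Inside the __main__ block, replace the placeholder with e.g.:<br>
# print(generate_reply(model, tokenizer, "Summarize the benefits of a read-only root filesystem."))
</div>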
</div>
<div class="card">
<h2>Deployment Instructions</h2>
<ol>
<li>Build the Docker image:
<div class="code-block">
docker build -t hardened-qwen3 .
</div>
</li>
<li>Run the container with model volume:
<div class="code-block">
docker run -d \<br>
--name qwen3-ai \<br>
-v /path/to/qwen3-model:/model:ro \<br>
--read-only \<br>
--network none \<br>
--cap-drop=ALL \<br>
hardened-qwen3
</div>
</li>
<li>Verify that the root filesystem is read-only:
<div class="code-block">
# Should print "true" when --read-only is in effect<br>
docker inspect -f '{{.HostConfig.ReadonlyRootfs}}' qwen3-ai
</div>
</li>
</ol>
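<p>The run command in step 2 can also be captured in a Compose file so the hardening flags stay version-controlled. A minimal sketch, assuming Docker Compose v2; the model path is a placeholder, and the tmpfs scratch mount is an assumption that is often needed alongside a read-only root:</p>
<div class="code-block">
# docker-compose.yml (sketch)<br>
services:<br>
&nbsp;&nbsp;qwen3-ai:<br>
&nbsp;&nbsp;&nbsp;&nbsp;image: hardened-qwen3<br>
&nbsp;&nbsp;&nbsp;&nbsp;container_name: qwen3-ai<br>
&nbsp;&nbsp;&nbsp;&nbsp;# same effect as --read-only, --network none and --cap-drop=ALL<br>
&nbsp;&nbsp;&nbsp;&nbsp;read_only: true<br>
&nbsp;&nbsp;&nbsp;&nbsp;network_mode: "none"<br>
&nbsp;&nbsp;&nbsp;&nbsp;cap_drop:<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;- ALL<br>
&nbsp;&nbsp;&nbsp;&nbsp;security_opt:<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;- no-new-privileges:true<br>
&nbsp;&nbsp;&nbsp;&nbsp;# small writable scratch space, often needed with a read-only root<br>
&nbsp;&nbsp;&nbsp;&nbsp;tmpfs:<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;- /tmp<br>
&nbsp;&nbsp;&nbsp;&nbsp;volumes:<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;- /path/to/qwen3-model:/model:ro
</div>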
</div>
<div class="card">
<h2>Additional Hardening</h2>
<p>For maximum security, consider these additional measures; an example combining several of them follows the list:</p>
<ul class="feature-list">
<li>Use Docker content trust for image verification</li>
<li>Sign your Docker images with cosign</li>
<li>Run in a dedicated user namespace</li>
<li>Use seccomp profiles to restrict syscalls</li>
<li>Enable AppArmor or SELinux policies</li>
<li>Regularly scan for vulnerabilities with trivy</li>
<li>Use immutable tags for production images</li>
<li>Implement runtime security monitoring</li>
<li>Store model files in encrypted volumes</li>
<li>Use hardware security modules for key management</li>
</ul>
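<p>As a concrete starting point, several of the measures above can be layered onto the run command from the deployment section. The sketch below is one possible combination; the seccomp profile path is a placeholder, and trivy must be installed separately on the host:</p>
<div class="code-block">
# Scan the image for known vulnerabilities before deploying<br>
trivy image hardened-qwen3<br><br>
# Run with a custom seccomp profile and no privilege escalation<br>
docker run -d \<br>
--name qwen3-ai \<br>
-v /path/to/qwen3-model:/model:ro \<br>
--read-only \<br>
--network none \<br>
--cap-drop=ALL \<br>
--security-opt no-new-privileges:true \<br>
--security-opt seccomp=/path/to/qwen3-seccomp.json \<br>
hardened-qwen3
</div>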
</div>
</main>
<footer>
<p>© 2023 Hardened AI Workstation. All rights reserved.</p>
<p>This implementation provides enterprise-grade security for local AI deployment.</p>
</footer>
</div>
</body>
</html>