URL proxy bypass

r/TPB

2009.04.27 03:17 newnetmp3 r/TPB

[link]


2008.03.20 13:18 Comedy

A place where Comedy meets Comedians
[link]


2013.06.12 01:56 xJRWR PirateProxy: A List of Piratebay Proxy Domains

This subreddit is temporarily private as part of a joint protest against Reddit's recent API changes, which break third-party apps and moderation tools, effectively forcing users to use the official Reddit app. https://www.reddit.com/ModCoord/comments/1476fkn/reddit_blackout_2023_save_3rd_party_apps/
[link]


2024.05.29 04:41 Ambitious-Summer-958 Streamlining Business Operations

In today's fast-paced business environment, efficiency is key to success. With increasing competition and rapidly evolving technologies, businesses need to constantly adapt and optimize their operations to stay ahead of the curve. One way to achieve this is by leveraging cutting-edge tools and technologies that can streamline processes and improve productivity.
IPRockets is a premium IP proxy service that offers a wide range of benefits for businesses looking to enhance their online operations. With IPRockets' free proxy list, businesses can access a variety of proxy servers from around the world, allowing them to mask their IP addresses and browse the internet anonymously. This can be particularly useful for businesses that need to access geo-restricted content or websites that may be blocked in their region.
By using IPRockets' free proxy list, businesses can also improve their security and protect their data from potential cyber threats. Proxy servers act as intermediaries between the user and the internet, helping to shield sensitive information from hackers and other malicious actors. This can be especially important for businesses that handle confidential data or financial transactions online.
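To make the intermediary idea concrete, here is a minimal sketch in Python of routing a request through a proxy server; the proxy address and test URL are placeholders for illustration, not actual IPRockets endpoints:
```
import requests

# Placeholder proxy address - substitute an entry from your proxy list
proxies = {
    "http": "http://203.0.113.10:8080",
    "https": "http://203.0.113.10:8080",
}

# The target site sees the proxy's IP address instead of yours
response = requests.get("https://httpbin.org/ip", proxies=proxies, timeout=10)
print(response.json())
```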
In addition to enhancing security, IPRockets' free proxy list can also boost businesses' efficiency by reducing latency and improving connection speeds. By routing traffic through proxy servers, businesses can bypass congestion and optimize their online performance. This can be crucial for businesses that rely on real-time data and need to stay competitive in today's fast-paced digital landscape.
Furthermore, IPRockets' free proxy list can help businesses save money on their online operations. By using proxy servers to access content and services from different locations, businesses can avoid costly international fees and access discounts that may be available in specific regions. This can be particularly advantageous for businesses that operate on a global scale and need to manage multiple online accounts and services.
Overall, leveraging IPRockets' free proxy list can provide businesses with a competitive edge by streamlining their operations and enhancing their online performance. With its advanced features, security benefits, and cost-saving potential, IPRockets is a valuable tool for businesses looking to optimize their online presence and drive efficiency across their operations. By taking advantage of this innovative technology, businesses can stay ahead of the curve and achieve long-term success in today's rapidly changing business landscape.
submitted by Ambitious-Summer-958 to u/Ambitious-Summer-958 [link] [comments]


2024.05.29 04:40 Wild_Instruction_819 Supply Chain Confidentiality in Web3

Introduction:
In the age of Web 3.0, where transparency and decentralization are key principles of the digital world, maintaining supply chain confidentiality has become more challenging than ever. With the rise of blockchain technology and smart contracts, ensuring the privacy and security of supply chain data has become a top priority for businesses around the globe. In this article, we will explore the importance of supply chain confidentiality in Web3 and how leveraging premium IP proxies like IPRockets can enhance privacy and security in supply chain management.
The Importance of Supply Chain Confidentiality in Web3:
Supply chains are complex networks of interconnected entities, including suppliers, manufacturers, distributors, and retailers. These entities exchange a vast amount of sensitive information, including product designs, pricing, inventory levels, and customer data. In the digital age, this information is often stored and transmitted online, making it susceptible to cyberattacks and data breaches.
Maintaining supply chain confidentiality is essential for protecting intellectual property, trade secrets, and competitive advantage. A breach of confidentiality can result in financial losses, reputational damage, and legal liabilities. In Web3, where data is stored on decentralized platforms and accessed through smart contracts, the risk of unauthorized access and data leaks is even higher.
IPRockets: A Premium IP Proxy for Supply Chain Privacy:
IPRockets is a premium IP proxy service that provides businesses with secure and anonymous access to the internet. With a global network of high-speed servers, IPRockets allows users to mask their IP addresses and encrypt their online activities, ensuring maximum privacy and security.
By using IPRockets, businesses can protect their supply chain data from prying eyes and cyber threats. With encrypted connections and secure protocols, IPRockets prevents unauthorized access to sensitive information and prevents data leaks. In addition, IPRockets allows businesses to bypass geo-restrictions and access restricted websites and services, making it ideal for international supply chain management.
Benefits of Leveraging IPRockets for Supply Chain Confidentiality:
There are several benefits to leveraging IPRockets for supply chain confidentiality in Web3. Some of the key advantages include:
  1. Enhanced Privacy: With IPRockets, businesses can protect their supply chain data from hackers, competitors, and government surveillance. By masking their IP addresses and encrypting their online activities, businesses can ensure the confidentiality of their sensitive information.
  2. Improved Security: IPRockets employs advanced security protocols, such as AES encryption and SSL/TLS encryption, to safeguard supply chain data from cyber threats. By using IPRockets, businesses can prevent data breaches and unauthorized access to their systems.
  3. Geo-Bypassing: IPRockets allows businesses to bypass geo-restrictions and access international websites and services. This is particularly useful for supply chain management, where cross-border communication and collaboration are essential.
  4. Cost-Effective: Compared to traditional VPN services, IPRockets offers competitive pricing and flexible plans, making it a cost-effective solution for businesses of all sizes.
Conclusion:
In conclusion, supply chain confidentiality is essential for businesses operating in Web3. By leveraging premium IP proxies like IPRockets, businesses can enhance privacy and security in their supply chain management. With encrypted connections, secure protocols, and global server networks, IPRockets provides businesses with the tools they need to protect their sensitive information and maintain a competitive edge in the digital age.
submitted by Wild_Instruction_819 to u/Wild_Instruction_819 [link] [comments]


2024.05.29 04:33 AdditionalHandle3594 Using Yuyu Proxy: A Free Proxy Guide

Yuyu Proxy offers a complimentary web proxy service designed to provide users with a straightforward, rapid, and cost-effective browsing experience. Acting as an intermediary between your device and the targeted website, it enables web browsing without disclosing your actual IP address. This proxy service allows access to popular websites such as Google, YouTube, Facebook, TikTok, and Instagram, all while safeguarding user privacy and security.

Advantages of Employing Yuyu Proxy

Anonymity: The Yuyu Proxy service serves as a unique tool for anonymous web surfing. It ensures the confidentiality of your online activities by functioning as a barrier between your device and the websites you visit. This barrier conceals your true identity, including your IP address, thus making it difficult for external parties to track your online behavior.
Accessibility: One of the appealing aspects of Yuyu Proxy is that it is offered free of charge. However, due to its free nature, it has certain limitations, such as slower connection speeds and restricted access to specific websites. For instance, it cannot be used for web scraping, creating multiple social media accounts, or performing tasks requiring proxies with higher standards. Overall, it is an excellent choice for basic browsing requirements.
Ease of Use: This web proxy does not require any software downloads or installations. Simply visit the website and input the URL you wish to access. Its user-friendly interface ensures that even individuals with limited internet knowledge can navigate it effortlessly.

Here’s a step-by-step guide on how to use it

Step 1: Visit the Yuyu Proxy Website

  1. Open Your Browser: Launch your preferred web browser on your device.
  2. Navigate to the Website: Enter https://www.yuyuproxy.com into your browser’s address bar and press Enter.

Step 2: Enter the URL

  1. Locate the Input Box: On the Yuyu Proxy homepage, you will find an input box.
  2. Input the Desired URL: Type in the URL of the website you wish to access. For example, you can type in www.duckduckgo.com for a test run.
  3. Use Quick Links: Alternatively, you can use the quick links provided below the input box for commonly used websites.

Step 3: Access the Website

  1. Click “Go”: After entering the desired URL, click the “Go” button.
  2. Wait for Loading: Wait for the loading indicator to signify that the proxy is processing your request.

Step 4: Verify Your IP Address Change

  1. Check the Result: Once the website loads, verify that your real IP address has been masked.
  2. Search for IP Information: For confirmation, search for “IPinfo” on your chosen website (e.g., DuckDuckGo) and visit the top result.
  3. Compare IP Addresses: Ensure that the displayed IP address is different from your actual one.

Step 5: Enjoy Anonymous Browsing

  1. Browse Confidently: Continue browsing the web anonymously and securely.
  2. Mobile Use: You can also follow the same steps on your mobile browser for seamless access.

Tips

Alternatives to Yuyu Proxy

For advanced proxy needs, consider OkeyProxy, a leading proxy provider offering residential and datacenter proxies. Catering to personal and business requirements, OkeyProxy provides solutions for social networks, account creation, accessing blocked sites, and web scraping.
submitted by AdditionalHandle3594 to u/AdditionalHandle3594 [link] [comments]


2024.05.28 23:31 deffcolony How to setup GitLab Pages with Traefik? getting 404 page not found

How can I configure GitLab Pages correctly without a DNS wildcard, using Docker + Traefik + Cloudflare?
I have created an A record for pages on Cloudflare DNS that points to my public IP, where it goes into Traefik (that's why you see 404 page not found).
https://preview.redd.it/2x2icwhwj83d1.png?width=494&format=png&auto=webp&s=98531c0c2b4a83d1ef3e8e5daf6caf7a7d6b55c6
So now Traefik has to correctly point this subdomain to GitLab, but I don't know how to configure this in Traefik's fileconfig.yml. It needs to redirect correctly so that the pages subdomain gets connected with my self-hosted GitLab at gitlab.DOMAIN.COM.
this is my current config:
docker-compose.yml
version: "3.8" services: gitlab-runner: image: gitlab/gitlab-runner:alpine container_name: gitlab-runner volumes: - /varun/docker.sock:/varun/docker.sock - ./gitlab-runner:/etc/gitlab-runner restart: unless-stopped depends_on: - web web: image: gitlab/gitlab-ce:latest container_name: gitlab-ce hostname: gitlab.DOMAIN.COM environment: GITLAB_OMNIBUS_CONFIG: external_url "https://gitlab.DOMAIN.COM" nginx['listen_https'] = false nginx['redirect_http_to_https'] = false nginx['listen_port'] = 80 letsencrypt['enable'] = false # GitLab Pages pages_external_url "https://pages.DOMAIN.COM" gitlab_pages['access_control'] = true gitlab_pages['namespace_in_path'] = true gitlab_pages['enable'] = true pages_nginx['enable'] = true pages_nginx['listen_https'] = false pages_nginx['redirect_http_to_https'] = true pages_nginx['listen_port'] = 5100 pages_nginx['proxy_set_headers'] = {"X-Forwarded-Proto" => "https","X-Forwarded-Ssl" => "on"} volumes: - ./config:/etc/gitlab - ./logs:/valog/gitlab - ./data:/vaopt/gitlab ports: - 8225:80 # - 8226:443 # - 5005:5005 - 5100:5100 # - 22:22 # - 587:587 restart: unless-stopped 
This is my traefik fileconfig.yml
# Gitlab router
gitlab-ce:
  entryPoints:
    - https
  rule: 'Host(`gitlab.DOMAIN.COM`)'
  service: gitlab-ce
  tls:
    certResolver: cloudflare
    domains:
      - main: "gitlab.DOMAIN.COM"
        sans:
          - "*.gitlab.DOMAIN.COM"
          - "*.pages.DOMAIN.COM"
  middlewares:
    - gitlab-redirectscheme

# GitLab - Pages router
pages:
  entryPoints:
    - websecure
  rule: 'Host(`pages.DOMAIN.COM`)'
  service: pages
  tls:
    certResolver: cloudflare
    domains:
      - main: gitlab.DOMAIN.COM
        sans:
          - '*.gitlab.DOMAIN.COM'
          - '*.pages.DOMAIN.COM'
  middlewares:
    - pages-redirectscheme

# Gitlab service
gitlab-ce:
  loadBalancer:
    passHostHeader: true
    servers:
      - url: http://192.168.x.x:8225

# GitLab - Pages service
pages:
  loadBalancer:
    passHostHeader: true
    servers:
      - url: http://192.168.x.x:5100

# GitLab redirect scheme middleware
gitlab-redirectscheme:
  redirectScheme:
    scheme: https
    permanent: false

# Pages redirect scheme middleware
pages-redirectscheme:
  redirectScheme:
    scheme: https
    permanent: false
submitted by deffcolony to gitlab [link] [comments]


2024.05.28 22:30 Temporary_Noise_4014 Publication of Results of Pre-Clinical Studies Support Efficacy and Drug Delivery Mechanism Potential of RenovoRx’s TAMP™ Therapy Platform to Improve Targeted Cancer Drug Treatment Deliver (Nasdaq: RNXT)


https://preview.redd.it/yixn7qly883d1.png?width=325&format=png&auto=webp&s=e5578caf1a1ea4f284cf6e1b658d4ef0e5557ef5
Data shows that the Trans-Arterial Micro-Perfusion (TAMP) platform increases intra-arterial pressure, improving drug delivery with a 100-fold increase in local tissue concentration of the therapy
TAMP offers the potential to increase efficacy, improve safety and widen therapeutic window of drugs or other agents
LOS ALTOS, CA – May 21, 2024 – RenovoRx, Inc. – (“RenovoRx” or the “Company”) (Nasdaq: RNXT), a clinical-stage biopharmaceutical company developing novel precision oncology therapies based on a local drug-delivery platform, today announced a publication of pre-clinical studies supporting the efficacy and drug delivery mechanism of RenovoRx’s Trans-Arterial Micro-Perfusion (“TAMP”) therapy platform. The data was published online in the peer-reviewed Journal of Vascular Interventional Radiology (“JVIR”) and will also be published in the print version.
The manuscript is authored by Khashayar Farsad, MD, PhD of the Department of Interventional Radiology at Oregon Health and Science University, and co-authored by Paula M. Novelli, MD, of the University of Pittsburgh Hillman Cancer Center, together with other researchers, including RenovoRx’s Chief Medical Officer, Dr. Ramtin Agah. Access the JVIR abstract: https://pubmed.ncbi.nlm.nih.gov/38508449/.
Currently, most cancer patients with solid tumors receive chemotherapy intravenously, meaning it is introduced systemically into the entire body and causes well known adverse side effects. RenovoRx’s patented TAMP therapy platform is designed to bypass traditional systemic delivery methods and provide precise delivery to bathe the target solid tumor in chemotherapy. This precise delivery also creates the potential to minimize a therapy’s systemic toxicities.
The pre-clinical data published in JVIR showed a 100-fold (two orders of magnitude) increase in local tissue concentration of the therapy with TAMP compared to conventional intravenous (IV) delivery. TAMP also showed advantages compared to historically available intra-arterial (IA) delivery approaches. TAMP’s novel approach to treatment offers the potential to increase an oncology therapy’s efficacy, improve safety, and widen its therapeutic window by focusing its distribution uniformly in target tissue.
“TAMP has the potential to provide a valuable treatment option to patients who have been diagnosed with solid tumors that may be difficult-to-treat,” said Dr. Farsad. “The study shows a possible mechanism for how TAMP can increase local therapeutic tissue concentration in solid tumors that is independent from traditional catheter-directed therapy. We are awaiting final outcomes of the Phase III clinical trial, currently underway, to validate this benefit.”
Dr. Farsad adds, “This platform has the potential to extend across a variety of unmet needs for localized therapeutic drug delivery.”
About the Phase III TIGeR-PaC Clinical Trial
TIGeR-PaC is RenovoRx’s ongoing Phase III randomized multi-center study evaluating the proprietary TAMP therapy platform for the treatment of Locally Advanced Pancreatic Cancer (LAPC.) RenovoRx’s first product candidate, RenovoGem™, is a novel oncology drug-delivery combination utilizing TAMP administration technology combined with the FDA-approved chemotherapy, gemcitabine. The TIGeR-PaC study is comparing treatment with TAMP to systemic intravenous chemotherapy, the current standard of care.
The first interim analysis in the TIGeR-PaC study occurred at the 26th event of the specified events (deaths), and was completed in March 2023, with the Data Monitoring Committee recommending a continuation of the study. The TIGeR-PaC study’s primary endpoint is a 6-month Overall Survival (OS) benefit with secondary endpoints including reduced side effects versus standard of care.
About Locally Advanced Pancreatic Cancer (LAPC)
According to American Cancer Society’s Cancer Facts & Figures 2023, pancreatic cancer has a 5-year combined overall survival rate of 13% (Stages I-IV) and is on track to be the second leading cause of cancer-related deaths before 2030. LAPC is diagnosed when the disease has not spread far beyond the pancreas, however, has advanced to the point where it cannot be surgically removed. LAPC is typically associated with patients in Stage 3 of the disease as determined by the TNM (tumor, nodes and metastasis) grading system.
About RenovoRx, Inc.
RenovoRx is a clinical-stage biopharmaceutical company developing novel precision oncology therapies based on a proprietary local drug-delivery platform for high unmet medical need with a goal to improve therapeutic outcomes for cancer patients undergoing treatment. RenovoRx’s patented Trans-Arterial Micro-Perfusion (TAMP™) therapy platform is designed to ensure precise therapeutic delivery to directly target the tumor while potentially minimizing a therapy’s toxicities versus systemic intravenous therapy. RenovoRx’s novel and patented approach to targeted treatment offers the potential for increased safety, tolerance, and improved efficacy. Our Phase III lead product candidate, RenovoGem™, a novel oncology drug-device combination product, is being investigated under a U.S. investigational new drug application that is regulated by the FDA’s 21 CFR 312 pathway. RenovoGem is currently being evaluated for the treatment of locally advanced pancreatic cancer by the Center for Drug Evaluation and Research (the drug division of FDA.)
RenovoRx is committed to transforming the lives of patients by delivering innovative solutions to change the current paradigm of cancer care. RenovoGem is currently under investigation for TAMP therapeutic delivery of gemcitabine and has not been approved for commercial sale.
For more information, visit www.renovorx.com. Follow RenovoRx on Facebook, LinkedIn, and Twitter.
submitted by Temporary_Noise_4014 to 10xPennyStocks [link] [comments]


2024.05.28 19:51 Sharvin95 Tailscale + Nginx for https, am i doing it right?

Hi,
I'm running an Unraid machine with the NPM docker and the Tailscale plugin. Both run well. I wanted to improve security and also use certain apps such as Vaultwarden, which requires HTTPS only. I used sudo tailscale cert to create the cert, used it in NPM as a custom SSL cert, forced HTTPS, and everything runs smoothly.
But I get an SSL_ERROR_BAD_CERT_DOMAIN error because *tailscale*.ts.net is a different name from the domain I reverse proxy from.
I'm no networking expert, so I don't understand it well enough yet, but how do I bypass this error, or am I doing it right?
I mostly just wanted to use Vaultwarden without using the Cloudflare method to do HTTPS (which I have already achieved), but currently I just want to fix the SSL issue if possible, or understand the issue further.
submitted by Sharvin95 to selfhosted [link] [comments]


2024.05.28 14:41 nisebblumberg Issue with AVProVideo -- Almost nobody affected but me! Just my luck...

So, any time I log into a world there is a total dice roll as to whether or not the video player works. What's upsetting is I will have all my friends in a world where the video fails to synchronize, and I have spent DAYS trying to resolve this issue, and I am currently out of options. I've even reached out to movie world makers and VR chat support who have been no help. I honestly have nowhere else to go save for reformatting my machine (seems so intense for this tiny issue)
Here is everything I tried so far. The issue is that any video player running AVProVideo will load to about 70%, then immediately give an error (in appdata logs) as you see in the title: [AVProVideo] Error: Loading failed. File not found, codec not supported, video resolution too high or insufficient system resources. It's been a giant thorn in my side and I have so far done everything people have suggested, but I come up short handed. I honestly have no idea what could be wrong at this point. Here is what I attempted: I'm running Windows 10 22H2. I have an NVidia RTX 4070ti. I have a Radeon 3700X. 32GB RAM. I'm not sure what else to try, or any other potential issues. I have strong belief at this point the issue has to do with AVProVideo somehow. Video players would work using other players on VRchat but NOT for AVProVideo specifically.
  1. Update video card drivers to latest WHQL.
  2. windows update to full -> sfc /scannow -> dism.exe /online /cleanup-image /restorehealth
  3. checked yt-dlp program using procmon (yt-dlp is the tool used by VRchat to actively download files onto your computer, play on the video player, then delete in real time)(procmon is an advanced microsoft utility to see what a program is touching at any point in time on a system)
  4. Switched computer over to separate mobile network to see if it was ISP block. (it wasnt)
  5. Moved from SteamVR to using AirLink on Oculus and bypass SteamVR by installing VRChat from there. Same issue.
  6. Run VRChat in administrative mode.
  7. Completely remove VRchat (including registry, %appdata%, etc) and reinstall.
  8. Log out VRchat, log back in
  9. NVidia control panel -> disable vsync on both VRChat and Global.
  10. Attempted replacing yt-dlp.exe with latest version from github. (VRChat overwrites this, it is also not suggested, and I did make it read-only to make it where it could not be overwritten. It still does not work unfortunately)
  11. Additional VRChat fix: go into certificates and delete old google certificates. No change.
  12. Updated Java.
  13. Manually moved ffmpeg into relevant location yt-dlp.exe was looking for through procmon. (no fix)
  14. Ensured I don't have an antivirus program blocking download of files. Disabled firewall too.
  15. Disabled voicemeeter (used to play sounds from my browser into VRchat, neat!) no fix.
  16. Switching VRchat from moving from an HDD to an SSD, no change.
  17. ...enabled untrusted URLs (I cannot tell you how many times I just check this just for fun at this point)
Does anyone have any idea to add? I'm just out of options save for reformat, and I really, really don't want to have to do that.
submitted by nisebblumberg to VRchat [link] [comments]


2024.05.28 11:45 Sad-Truck-2375 WTFProxy Rotating residential proxies for your scraping needs

WTFproxy is a reliable and versatile proxy service offering high-quality HTTP(S) proxies for various needs. Whether it's ensuring privacy, bypassing geo-blocking, or automating web scraping, WTFproxy provides flexible solutions. Their diverse network covers multiple countries and cities, ensuring a smooth and stable connection at all times. Additionally, they offer user-friendly tools and fast customer support, ensuring an excellent user experience. If you're looking for a trustworthy proxy service that meets your needs, WTFproxy is definitely the best choice.
https://www.wtfproxy.com/?ref=j5qDhmNeBEpdh0ewQ_eBJ
submitted by Sad-Truck-2375 to WTFProxy [link] [comments]


2024.05.28 10:56 thetschulian Ivanti MobileIron MDM Check-in / Profile installation leads to Proxy Error

Hey there,
we are currently migrating our MobileIron MDM. Basic Setup etc went well.
Our OLD system (mdm.company.de) was directly natted with a public IP.
Our NEW system (epmm.company.de) is being published via a Sophos WAF. We want a higher security level, which is why we put a Sophos WAF between the Internet and the on-premise appliance.
We can visit the registration page on the new system via epmm.company.de/go. After entering the username and password, the profile is downloaded. So far so good.
If I try to install the profile now, we get a 502 Proxy Error. I can see the following URL is being contacted: https://epmm.company.de/mifs/c/i/mdm/checkin.html
Additional info: If I open https://epmm.company.de/mifs/c/i/ it works (page-does-not-exist response from the EPMM server) -> but as soon as I add /mdm to the URL (https://epmm.company.de/mifs/c/i/mdm) I get a browser error "ERR_BAD_SSL_CLIENT_AUTH_CERT". I think that's the reason why the Sophos WAF responds with a "502 proxy error".
If I try it, for example, with our working system (mdm.company.de/mifs/c/i/mdm/checkin.html) I get a "valid" response: This method/operation is not allowed.
I guess there is an internal redirect error as soon as the /mdm directive is added. Maybe I missed a checkbox which needs to be enabled?
submitted by thetschulian to sysadmin [link] [comments]


2024.05.28 09:33 AussieHyena IIS 10 Reverse Proxy to Tomcat

So, I know that the standard approach is to use isapi_redirect to create the connection over AJP between IIS and Tomcat. But we've found today that may no longer be correct.
Our setup has ARR and UrlRewrite installed and we were proxying requests to a Tomcat 9 instance on another server via isapi_redirect. In our non-prod environments we were able to successfully connect by a normal UrlRewrite rule.
Are we missing something? Or does anyone know if/when this changed? My suspicion is that it is related to the introduction of HttpPlatformHandler.
submitted by AussieHyena to WindowsServer [link] [comments]


2024.05.28 08:43 ThunderKai_ Buying using Superbuy

I was looking to buy a figure and the cheapest place I could find her was through superbuy.
https://m.superbuy.com/en/goodsdetail/?url=https%3A%2F%2Fdetail.tmall.com%2Fitem.htm%3Fid%3D671845798573%26ali_trackid%3D2%3Amm_119784642_132400295_108545700400#!
I have never used a proxy before so I wanted to know if anyone has ever used this service before and if it is safe or actually going to end up cheaper than purchasing for a higher price elsewhere.
submitted by ThunderKai_ to AnimeFigures [link] [comments]


2024.05.28 05:53 AdditionalHandle3594 Resolving the Instagram Open Proxy Error

What Does Open Proxy Mean on Instagram?

Instagram has stringent rules regarding its usage, and encountering an "Open Proxy" error indicates that you won't be able to access the app or website from the device using the flagged IP address. This typically means Instagram has banned your IP, rejecting all data requests sent from it.

What May Cause This?

  1. Excessive liking, commenting, or following/unfollowing accounts.
  2. An IP address associated with past malicious activities on the platform.
  3. Using public Wi-Fi that has been exploited for malicious purposes.
  4. Unrecognized login attempts or posting unusual content.
  5. Clicking on links from unfamiliar sources.

How to Fix Instagram Open Proxy Error?

The first step is to try unblocking access directly through Instagram support, though success is often limited, especially if your actions still violate the platform's rules. Another option is using an Instagram proxy. An Instagram proxy acts as an intermediary server, masking your device’s IP address and providing an additional layer of privacy and security. This setup allows your device to communicate with Instagram via the proxy server, which forwards your requests on your behalf.

The Power of Instagram Unblock Proxy

Beyond resolving IP bans, Instagram proxies offer various advantages, such as managing multiple accounts, growing followers, and scraping information. Here’s what you can do with Instagram unblock proxies:
  1. Access Unblocked: Proxies can help bypass restrictions in workplaces, educational institutions, or countries. While VPNs provide comprehensive security and privacy, proxies are faster as they do not encrypt the entire internet connection.
  2. Manage Multiple Accounts: Instagram allows up to five accounts per user. Proxies are essential for handling more than a few accounts, preventing your IP from being flagged by Instagram.
  3. Grow Followers: Multiple accounts with unique profiles and content strategies can attract more followers. While this approach may not be entirely ethical, adhering to Instagram’s guidelines ensures sustainable growth.

How to Use Instagram Proxy Effectively?

  1. Assign One Account per Proxy: Allocate just one account to a distinct proxy to prevent potential banning issues. Choose high-quality, reputable proxy providers with IP rotation and advanced security features.
  2. Scrape Slowly: Use web scraping tools or custom scripts in programming languages like Python. Mimic human behavior by adding random pauses between requests and limiting concurrent requests (a rough sketch follows this list).
  3. Avoid Free Proxies: Free proxies are often unreliable and insecure. They are usually overcrowded, slow, and may not provide privacy or anonymity. Known free proxy IP addresses are actively blocked by Instagram.
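As a rough illustration of this pacing, a minimal Python sketch might look like the following; the proxy gateway, credentials, and profile URLs are placeholders, not real OkeyProxy endpoints:
```
import random
import time

import requests

# Placeholder rotating-proxy gateway - substitute your provider's endpoint and credentials
proxies = {
    "http": "http://user:pass@proxy.example.com:8000",
    "https": "http://user:pass@proxy.example.com:8000",
}

profile_urls = [
    "https://www.instagram.com/instagram/",
    "https://www.instagram.com/natgeo/",
]

for url in profile_urls:
    response = requests.get(url, proxies=proxies, timeout=15)
    print(url, response.status_code)
    # Random pause between requests to mimic human pacing
    time.sleep(random.uniform(5, 15))
```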
OkeyProxy is a top choice for Instagram proxies. Offering rotating residential, static residential, and data center proxies, OkeyProxy’s rotating residential proxies, obtained from authentic residential users, make your Instagram bot nearly identical to real users, allowing you to automate tasks without the risk of being blocked.
submitted by AdditionalHandle3594 to u/AdditionalHandle3594 [link] [comments]


2024.05.28 02:38 e11ipsism Can not focus camera

Hello, I have tried many variations of v4l2ctl: parameters and values and I cannot get it to work. The camera is on but blurry. I spent the whole day trying and troubleshooting. Here is one conf and log.

crowsnest.conf

This is a typical default config.

Also used as default in mainsail / MainsailOS

See:

https://github.com/mainsail-crew/crowsnest/blob/master/README.md

for details to configure to your needs.

Information about ports and according URL's

Port 8080 equals /webcam/?action=[stream/snapshot]

Port 8081 equals /webcam2/?action=[stream/snapshot]

Port 8082 equals /webcam3/?action=[stream/snapshot]

Port 8083 equals /webcam4/?action=[stream/snapshot]

Note: These ports are default for most Mainsail

installations. To use any other port would involve

changing the proxy configuration or using directly

http://:/?action=[stream/snapshot]

RTSP Stream URL: ( if enabled and supported )

rtsp://:/stream.h264

[crowsnest]
log_path: /home/pi/printer_data/logs/crowsnest.log
log_level: verbose      # Valid Options are quiet/verbose/debug
delete_log: false       # Deletes log on every restart, if set to true
no_proxy: false

[cam 1]
mode: camera-streamer   # ustreamer - Provides mjpg and snapshots. (All devices)
                        # camera-streamer - Provides webrtc, mjpg and snapshots. (rpi + Raspi OS based only)
enable_rtsp: false      # If camera-streamer is used, this enables also usage of an rtsp server
rtsp_port: 8554         # Set different ports for each device!
port: 8080              # HTTP/MJPG Stream/Snapshot Port
device: /base/soc/i2c0mux/i2c@1/imx519@1a   # See Log for available ...
resolution: 640x480     # widthxheight format
max_fps: 15             # If Hardware Supports this it will be forced, otherwise ignored/coerced.

custom_flags: # You can run the Stream Services with custom flags.

v4l2ctl: AfMode=2
crowsnest.log
[05/27/24 18:41:27] crowsnest: crowsnest - A webcam Service for multiple Cams and Stream Services. [05/27/24 18:41:27] crowsnest: Version: v4.1.9-1-gd75a3ae [05/27/24 18:41:27] crowsnest: Prepare Startup ... [05/27/24 18:41:27] crowsnest: INFO: Host information: [05/27/24 18:41:27] crowsnest: Host Info: Distribution: Debian GNU/Linux 11 (bullseye) [05/27/24 18:41:27] crowsnest: Host Info: Release: MainsailOS release 1.3.2 (bullseye) [05/27/24 18:41:27] crowsnest: Host Info: Kernel: Linux 6.1.21-v8+ aarch64 [05/27/24 18:41:27] crowsnest: Host Info: Model: Raspberry Pi Zero 2 W Rev 1.0 [05/27/24 18:41:27] crowsnest: Host Info: Available CPU Cores: 4 [05/27/24 18:41:27] crowsnest: Host Info: Available Memory: 366036 kB [05/27/24 18:41:27] crowsnest: Host Info: Diskspace (avail. / total): 106G / 116G [05/27/24 18:41:27] crowsnest: INFO: Checking Dependencies [05/27/24 18:41:27] crowsnest: Dependency: 'crudini' found in /usbin/crudini. [05/27/24 18:41:27] crowsnest: Dependency: 'find' found in /usbin/find. [05/27/24 18:41:27] crowsnest: Dependency: 'xargs' found in /usbin/xargs. [05/27/24 18:41:27] crowsnest: Dependency: 'ustreamer' found in bin/ustreamesrc/ustreamer.bin. [05/27/24 18:41:27] crowsnest: Dependency: 'camera-streamer' found in bin/camera-streamecamera-streamer. [05/27/24 18:41:27] crowsnest: Version Control: ustreamer is up to date. (v5.48) [05/27/24 18:41:28] crowsnest: Version Control: camera-streamer is up to date. ((f1627aa)) [05/27/24 18:41:28] crowsnest: INFO: Print Configfile: '/home/pi/printer_data/config/crowsnest.conf' [05/27/24 18:41:28] crowsnest: [crowsnest] [05/27/24 18:41:28] crowsnest: log_path: /home/pi/printer_data/logs/crowsnest.log [05/27/24 18:41:28] crowsnest: log_level: verbose [05/27/24 18:41:28] crowsnest: delete_log: false [05/27/24 18:41:28] crowsnest: no_proxy: false [05/27/24 18:41:28] crowsnest: [05/27/24 18:41:28] crowsnest: [cam 1] [05/27/24 18:41:28] crowsnest: mode: camera-streamer [05/27/24 18:41:28] crowsnest: [05/27/24 18:41:28] crowsnest: enable_rtsp: false [05/27/24 18:41:28] crowsnest: rtsp_port: 8554 [05/27/24 18:41:28] crowsnest: port: 8080 [05/27/24 18:41:28] crowsnest: device: /base/soc/i2c0mux/i2c@1/imx519@1a [05/27/24 18:41:28] crowsnest: resolution: 640x480 [05/27/24 18:41:28] crowsnest: max_fps: 15 [05/27/24 18:41:28] crowsnest: v4l2ctl: AfMode (int): min=0 max=2 [05/27/24 18:41:28] crowsnest: INFO: Detect available Devices [05/27/24 18:41:31] crowsnest: INFO: Found 1 total available Device(s) [05/27/24 18:41:32] crowsnest: Detected 'libcamera' device -> /base/soc/i2c0mux/i2c@1/imx519@1a [05/27/24 18:41:32] crowsnest: 'libcamera' device(s) resolution(s) : [05/27/24 18:41:32] crowsnest: 0 : imx519 4656x3496 10-bit RGGB [05/27/24 18:41:32] crowsnest: Colorspace: 'SRGGB10_CSI2P' : 1280x720 [80.01 fps - (1048, 1042)/2560x1440 crop] [05/27/24 18:41:32] crowsnest: 1920x1080 [60.05 fps - (408, 674)/3840x2160 crop] [05/27/24 18:41:32] crowsnest: 2328x1748 [30.00 fps - (0, 0)/4656x3496 crop] [05/27/24 18:41:32] crowsnest: 3840x2160 [18.00 fps - (408, 672)/3840x2160 crop] [05/27/24 18:41:32] crowsnest: 4656x3496 [9.00 fps - (0, 0)/4656x3496 crop]
[05/27/24 18:41:32] crowsnest: [05/27/24 18:41:33] crowsnest: 'libcamera' device controls : [05/27/24 18:41:33] crowsnest: ColourGains (float): min=0.000000 max=32.000000 [05/27/24 18:41:33] crowsnest: AfMetering (int): min=0 max=1 [05/27/24 18:41:33] crowsnest: AnalogueGain (float): min=1.000000 max=16.000000 [05/27/24 18:41:33] crowsnest: Saturation (float): min=0.000000 max=32.000000 [05/27/24 18:41:33] crowsnest: Contrast (float): min=0.000000 max=32.000000 [05/27/24 18:41:33] crowsnest: AeMeteringMode (int): min=0 max=3 [05/27/24 18:41:33] crowsnest: AfMode (int): min=0 max=2 [05/27/24 18:41:33] crowsnest: AeConstraintMode (int): min=0 max=3 [05/27/24 18:41:33] crowsnest: AeEnable (bool): min=false max=true [05/27/24 18:41:33] crowsnest: ExposureTime (int): min=282 max=118430097 [05/27/24 18:41:33] crowsnest: AfPause (int): min=0 max=2 [05/27/24 18:41:33] crowsnest: AfRange (int): min=0 max=2 [05/27/24 18:41:33] crowsnest: NoiseReductionMode (int): min=0 max=4 [05/27/24 18:41:33] crowsnest: Sharpness (float): min=0.000000 max=16.000000 [05/27/24 18:41:33] crowsnest: AwbEnable (bool): min=false max=true [05/27/24 18:41:33] crowsnest: ExposureValue (float): min=-8.000000 max=8.000000 [05/27/24 18:41:33] crowsnest: AwbMode (int): min=0 max=7 [05/27/24 18:41:33] crowsnest: AeExposureMode (int): min=0 max=3 [05/27/24 18:41:33] crowsnest: AfSpeed (int): min=0 max=1 [05/27/24 18:41:33] crowsnest: AfTrigger (int): min=0 max=1 [05/27/24 18:41:33] crowsnest: Brightness (float): min=-1.000000 max=1.000000 [05/27/24 18:41:33] crowsnest: LensPosition (float): min=0.000000 max=32.000000 [05/27/24 18:41:35] crowsnest: [05/27/24 18:41:35] crowsnest: Try to start configured Cams / Services... [05/27/24 18:41:39] crowsnest: INFO: Configuration of Section [cam 1] looks good. Continue ... [05/27/24 18:41:41] crowsnest: ... Done! [05/27/24 18:41:43] crowsnest: V4L2 Control: Handling done by camera-streamer ... [05/27/24 18:41:43] crowsnest: V4L2 Control: Trying to set: AfMode (int): min=0 max=2 [05/27/24 18:41:43] crowsnest: Starting camera-streamer with Device /base/soc/i2c0mux/i2c@1/imx519@1a ...
Any replies would be appreciated.
submitted by e11ipsism to klippers [link] [comments]


2024.05.28 00:27 Accurate-Strike-6771 pdh.dll fails to install with Winetricks

Hello,
I've been trying to set up XDefiant, but it keeps on crashing when entering a lobby. After a quick search, I found that I needed to install pdh.dll, however it first gives me this message when I try to install:
Checksum for /home/person/.cache/winetricks/win7sp1/windows6.1-KB976932-X86.exe did not match, retrying download
After I click "OK", it gives me this message:
SHA256 mismatch!
URL: http://download.windowsupdate.com/msdownload/update/software/svpk/2011/02/windows6.1-kb976932-x86_c3516bc5c9e69fee6d9ac4f981f5b95977a8a2fa.exe
Downloaded: b15b13ff1d4c2eb07723c208a4546cf24911ea80a9799e6969ea727b2866e2e3
Expected: e5449839955a22fc4dd596291aff1433b998f9797e1c784232226aba1f8abd97
This is often the result of an updated package such as vcrun2019.
If you are willing to accept the risk, you can bypass this check.
Alternatively, you may use the --force option to ignore this check entirely.
Continue anyway?
When I click "yes", it finally gives me this message:
Note: command cabextract -q -d /home/person/.wine/dosdevices/c:/windows/temp -L -F x86_microsoft-windows-p..rastructureconsumer_31bf3856ad364e35_6.1.7601.17514_none_b5e3f88a8eb425e8/pdh.dll /home/person/.cache/winetricks/win7sp1/windows6.1-KB976932-X86.exe returned status 1. Aborting.
Any help would be appreciated! I am using OpenSUSE Tumbleweed.
submitted by Accurate-Strike-6771 to linux4noobs [link] [comments]


2024.05.27 23:07 usrdef nginx location rules for subdirectories

I am setting up phpmyadmin.
I have the subdomain working fine, via phpmyadmin.domain.com, however, I wanted to also add domain.com/phpmyadmin
After many attempts with trial and error, I came up with this:
location ^~ /phpmyadmin/ {
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $remote_addr;
    proxy_set_header X-Forwarded-Host $server_name;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_pass http://172.18.0.12/;
}
Other attempts would return things like 404 errors, or if you didn't add a trailing / and just used /phpmyadmin, you would get a white page, yet /phpmyadmin/ worked.
The issue with the rule above is that if I go to https://domain.com/phpmyadmin it asks me to sign into phpmyadmin, great.
After I sign in, it redirects me to https://domain.com and not the subdirectory, which should be https://domain.com/phpmyadmin
So then I have to edit the URL in the browser and append /phpmyadmin to the end so that I can go back to the page I was on, and then it works fine. I'm signed in.
Edit: I found a solution for this issue by using:

location ^~ /phpmyadmin/ {
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $remote_addr;
    proxy_set_header X-Forwarded-Host $server_name;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_pass http://172.18.0.12/;
    proxy_redirect ~/(.+) /phpmyadmin/$1;
}
I appended the last line at the end: proxy_redirect ~/(.+) /phpmyadmin/$1;
But I'm questioning if all of this is necessary.
Right now I have all of this running on docker, with the following containers:
- mariadb
- php 8
- phpmyadmin
- nginx
All containers have their own IP addresses, and I've read that you can summon other docker containers by using the docker container name, but I can't seem to get that working. So I had to use the manually assigned IP of the phpmyadmin container as shown above.
When I attempted to use the docker container, I added the following:
upstream docker-pma {
    server phpmyadmin:80;
}
phpmyadmin being the name of the docker container.
And then inside my server rule:
location ^~ /phpmyadmin/ {
    proxy_pass http://docker-pma;
}
And that just returns
```
Not Found

The requested URL was not found on this server.
```
And yes, within docker, I have assigned all the containers to the same network. phpmyadmin, nginx, php, mariadb.
Nginx, phpmyadmin, and mariadb docker logs show no errors, and that everything is operating normally.
submitted by usrdef to nginx [link] [comments]


2024.05.27 22:15 Accurate-Strike-6771 pdh.dll fails to install with Winetricks

Hello,
I've been trying to set up XDefiant, but it keeps on crashing when entering a lobby. After a quick search, I found that I needed to install pdh.dll, however it first gives me this message when I try to install:
Checksum for /home/person/.cache/winetricks/win7sp1/windows6.1-KB976932-X86.exe did not match, retrying download
After I click "OK", it gives me this message:
SHA256 mismatch!
URL: http://download.windowsupdate.com/msdownload/update/software/svpk/2011/02/windows6.1-kb976932-x86_c3516bc5c9e69fee6d9ac4f981f5b95977a8a2fa.exe
Downloaded: b15b13ff1d4c2eb07723c208a4546cf24911ea80a9799e6969ea727b2866e2e3
Expected: e5449839955a22fc4dd596291aff1433b998f9797e1c784232226aba1f8abd97
This is often the result of an updated package such as vcrun2019.
If you are willing to accept the risk, you can bypass this check.
Alternatively, you may use the --force option to ignore this check entirely.
Continue anyway?
When I click "yes", it finally gives me this message:
Note: command cabextract -q -d /home/person/.wine/dosdevices/c:/windows/temp -L -F x86_microsoft-windows-p..rastructureconsumer_31bf3856ad364e35_6.1.7601.17514_none_b5e3f88a8eb425e8/pdh.dll /home/person/.cache/winetricks/win7sp1/windows6.1-KB976932-X86.exe returned status 1. Aborting.
Any help would be appreciated! I am using OpenSUSE Tumbleweed.
submitted by Accurate-Strike-6771 to linux_gaming [link] [comments]


2024.05.27 18:06 InevitableOld3322 Gitea server behind swag reverse proxy issues

Hi all,
I am currently trying to set up a Gitea server that I can access from anywhere. I got the server up and running and edited the gitea subdomain template that is provided by SWAG for reverse proxying (https://github.com/linuxserver/reverse-proxy-confs/blob/master/gitea.subdomain.conf.sample). I also added the server config in the app.ini of Gitea as mentioned at the top.
While I can access the webpage just fine through HTTPS, I am having issues when issuing a git clone of a repository. It goes through the web GUI for authentication and asks to authorize the session fine, but then redirects to a 127.0.0.1 URL, which throws the error ERR_SSL_PROTOCOL_ERROR.
Has anyone had the same issue and figured out what it was?
Some things I tried myself:
I also set up a LAN-only test server to check it's not related to Gitea itself, but that seems to clone just fine.
Thx in advance
submitted by InevitableOld3322 to unRAID [link] [comments]


2024.05.27 17:29 Marry_06 My Apache is not communicating with Django as a backend

Hello,
I have the following architecture:
My apache configuration is like this
ServerName www.myapplication.com
ServerAlias myapplication.com

# Frontend ProxyPass
ProxyPass / http://192.168.10.3:3000/
ProxyPassReverse / http://192.168.10.3:3000/

# Backend ProxyPass
ProxyPass /api/ http://192.168.10.3:8000/api/
ProxyPassReverse /api/ http://192.168.10.3:8000/api/

ErrorLog ${APACHE_LOG_DIR}/cftappsec_error.log
CustomLog ${APACHE_LOG_DIR}/cftappsec_access.log combined
When I access the URL www.myapplication.com/ I get the login page and it works fine (Apache reaching the frontend is OK).
When I try to test the login I get a 404 error, and from my analysis Apache is not reaching the backend, which I find weird because the frontend and backend are on the same server.
I tried adding firewall rules to allow all types of traffic from Apache to the application server and vice versa ==> I still get the 404.
From the Apache server I ran the curl command.
So at this stage I confirmed that my Apache is not communicating with the backend.
Below is the login page axios call, which I think may be responsible for this issue.
login.js from React:
axios.post('http://www.myapplication.com/api/token/', user, {
    headers: { 'Content-Type': 'application/json' },
  })
  .then(response => {
    const { data } = response;
    console.log("DATA :", data);
    localStorage.clear();
    localStorage.setItem('access_token', data.access);
    localStorage.setItem('refresh_token', data.refresh);
    axios.defaults.headers.common['Authorization'] = `Bearer ${data.access}`;
    navigate('/Dashboard');
  })
  .catch(error => {
    console.error("Erreur lors de la soumission du formulaire :", error);
  });
};
for the urls.py
urlpatterns = [
    ......................
    path('token/', TokenObtainPairAndRefreshView.as_view(), name='token_obtain_pair'),
    path('token/refresh/', TokenRefreshView.as_view(), name='token_refresh'),
]
For the views.py => I have a few doubts about the get method, but when I changed it to post it still gives me a 404 error.
class TokenObtainPairAndRefreshView(TokenObtainPairView):
    def get(self, request, *args, **kwargs):
        # Allow GET method for token obtain
        return super().post(request, *args, **kwargs)

class TokenRefreshView(BaseTokenRefreshView):
    def get(self, request, *args, **kwargs):
        # Allow GET method for token refresh
        return super().post(request, *args, **kwargs)

class MyTokenObtainPairView(TokenObtainPairView):
    serializer_class = MyTokenObtainPairSerializer
for the settings.py
CORS_ALLOWED_ORIGINS = [
    "http://localhost:3000",   # Frontend server address
    "http://127.0.0.1:8000",   # used for local test
    "http://myapplication.com",
]
CORS_ORIGIN_ALLOW_ALL = True
ALLOWED_HOSTS = [
    'localhost',
    '127.0.0.1',
    '*',
]
and the URL patterns are well set too
urlpatterns = [
    path('admin/', admin.site.urls),
    path('', include('MyApp.urls')),
]
Does anyone have any idea? Maybe changing some setting when moving the code from local to the VM affected the backend code?
submitted by Marry_06 to django [link] [comments]


2024.05.27 16:55 ThinRizzie Seren asking for zip url when playing video

Installed Kodi directly on my Google TV from the built-in Play Store. Then I installed Seren and configured it with a4kScrapers, my AllDebrid account, and my Trakt account. There were a few hiccups, but things seemed to be working well.
When I tried to play a test stream, it worked! … kinda. I can hear the audio and such, but the video just says ‘seren: enter zip url’. Since this is in the viewport, I can’t interact with it at all. I can’t seem to find any setting that will allow me to set a default here or bypass it.
Has anyone else encountered this? Is there a way around it? Thanks in advance.
submitted by ThinRizzie to Addons4Kodi [link] [comments]


2024.05.27 15:29 _MirrorMask_ Another merch bot gone

Though it's good that UMG is trying out some solutions to improve the security of the store, I wonder if this will help out genuine fans or just give more advantage to scalpers/resellers who can afford a bypass system.
submitted by _MirrorMask_ to TaylorSwiftMerch [link] [comments]


2024.05.27 09:41 Huge_Line4009 VPN vs Proxy in 2024: Which is the Better Choice for You?

Hey Reddit!
The digital landscape is evolving, and with it, our needs for privacy and security online. Today, I'm diving deep into the ongoing debate: VPN vs Proxy—which one should you be using in 2024? I've gathered some up-to-date info, practical examples, and some solid data to help you make an informed decision.
What’s the Difference?
First things first, let’s clarify what we’re talking about:
Use Cases
To understand which tool is best for you, consider these scenarios:
  1. Streaming: Want to watch content from another country? A VPN is generally better because many streaming services like Netflix detect and block proxies.
  2. Browsing Anonymously: If you're just looking to hide your IP address while surfing the web casually, a simple proxy might suffice. But remember, it doesn't offer encryption.
  3. Security: If security is a priority (like when you're on public Wi-Fi), VPNs are superior due to their encryption capabilities.
Performance and Speed
Cost and Ease of Use
Data and Privacy
When it comes to data and privacy, not all services are created equal. Here’s a quick breakdown:
2024 Trends and Considerations
Examples
Conclusion
Both VPNs and proxies have their place in your internet toolkit, but your choice should depend on what you value more: complete privacy and security or speed and simplicity? In most security-conscious scenarios, VPNs tend to be the better option due to their encryption and broader feature sets.
What do you all think? Have you had better experiences with VPNs or proxies, especially in recent updates?
submitted by Huge_Line4009 to PrivatePackets [link] [comments]


2024.05.27 09:11 danishkirel On-Demand Local Inference on my gaming pc

Edit: FFS editing killed all the formatting - I hope I fixed it again
I set up something fun for myself yesterday after I grew dissatisfied with my original local LLM project. I started with a Ryzen APU with 64GB DDR5 RAM, which gave reasonable generation speeds for models up to 14B, BUT my main goal was to eventually replace cloud services in my smart home, and large context windows (passing in all my smart home state in structured format) showed the shortcomings of the setup: with 4k context it took a minute to start generating. So I benchmarked against my Windows gaming PC with a 3070 Ti. That only has 8GB of VRAM, but it's so fast in comparison for something like llama3:8b. That got me thinking: can I spin up the gaming PC with ollama running on demand when I try to call the ollama API?
And indeed! In a nutshell, I now set up a reverse proxy on my homelab that proxies to my gaming PC, but if that isn't available it wakes it up and holds the connection until it becomes available. I just use OLLAMA_HOST=ollama-pc.lan or configure this url in frontends and tools. It spins up from hibernate in just a few seconds and goes back to sleep after some minutes of inactivity. Works much better than I hoped it would.
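As a quick sanity check, something like this little Python call against the proxied endpoint is enough to trigger the wake-up and get a response once the PC is awake (the model name is just whatever happens to be pulled on the gaming PC, and the long timeout covers the wake-from-hibernate delay):
```
import requests

# Any request to the proxy wakes the gaming PC if it is hibernating;
# the first call can take a while until Caddy sees the backend come up.
resp = requests.post(
    "http://ollama-pc.lan:11434/api/generate",
    json={
        "model": "llama3:8b",               # whatever model is pulled on the gaming PC
        "prompt": "Say hi in one sentence.",
        "stream": False,
    },
    timeout=180,                            # generous, covers wake-from-hibernate
)
print(resp.json()["response"])
```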
Next step: Upgrade my Gaming PC's graphics card to a 3090? Let's see.
This is a rough guide how I set it up if anyone wants to follow:
  1. I set up my PC for wake on lan and passwordless login (already did that before - it's headless anyway, sitting in the closet, and I game using Moonlight from Steam Deck and Virtual Desktop from Quest 3)
  2. Added ollama-pc.lan to my local DNS server (Pihole) pointing to my homelab's IP (I have an Intel NUC running Home Assistant and a few other things)
  3. Recompile Caddy server (which I use for reverse proxying local dns names to docker containers) with some plugins:
```
FROM caddy:builder AS builder

RUN xcaddy build \
    --with github.com/dulli/caddy-wol \
    --with github.com/abiosoft/caddy-exec

FROM caddy:latest

RUN apk --no-cache add curl

COPY --from=builder /usr/bin/caddy /usr/bin/caddy
```
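Since Caddy runs in Docker here, the rebuilt image gets wired in with a compose service roughly like this (service name, paths, and published ports below are illustrative, adjust them to your own stack):
```
services:
  caddy:
    build: ./caddy                 # folder containing the Dockerfile above
    container_name: caddy
    restart: unless-stopped
    ports:
      - "80:80"
      - "443:443"
      - "11434:11434"              # the ollama-raw.lan site block listens here
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile
      - caddy_data:/data
volumes:
  caddy_data:
```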
  4. Create a Wake on LAN switch in Home Assistant:
switch:
  - platform: wake_on_lan
    name: gaming pc
    host: 192.168.50.10
    mac: "xx:xx:xx:xx:xx:xx"
  5. Create a webhook automation in Home Assistant to spin up the gaming PC
alias: raw webhook
description: ""
trigger:
  - platform: webhook
    allowed_methods:
      - POST
      - PUT
    local_only: true
    webhook_id: gaming_pc_on
condition: []
action:
  - service: switch.turn_on
    metadata: {}
    data: {}
    target:
      entity_id: switch.gaming_pc
mode: single
  6. Add the following section to the Caddyfile
http://ollama-raw.lan:11434 {
    reverse_proxy http://192.168.50.10:11434
    tls internal
    log
    handle_errors {
        @502 expression {err.status_code} == 502
        handle @502 {
            #wake_on_lan 2C:F0:5D:95:DD:14 # doesn't work see below
            exec curl -vk -XPOST https://homeassistant.lan/api/webhook/gaming_pc_on {
                pass_thru
            }
            reverse_proxy http://192.168.50.10:11434 {
                lb_try_duration 120s
            }
        }
    }
}
  7. Set the gaming PC to hibernate after x minutes
One remark: I tried to use wake on lan from Caddy directly, but I run Caddy in Docker so I can't broadcast the WOL packet on the host network (unless in `network_mode: host`, which didn't work for me) - that's what makes the Home Assistant shenanigans necessary. If you run a reverse proxy on bare metal you may not need that.
submitted by danishkirel to LocalLLaMA [link] [comments]

