Unlimited scroll tumblr html code

Cookie Clicker

2013.08.11 17:13 ReubenMcHawk_ Cookie Clicker

A subreddit for the popular cookie-clicking game.
[link]


2014.11.19 03:48 $1 at a time

If a million people gave a dollar to someone, they could be a millionaire.
[link]


2010.11.26 12:17 AstralPro Torment: Tides of Numenera

Discussion about Torment: Tides of Numenera
[link]


2024.05.19 02:45 Frogdude13 Visible (by Verizon) Promo code 3BPN62R == $5 first month of unlimited service

All of the info you need is at: https://www.visible.com/get/3BPN62R
Use my friend code 3BPN62R during checkout and you’ll get your first month of service for only $5 w/ FREE activation.
(Before finalizing your order, double-check that 3BPN62R was applied so you get the discount!)
submitted by Frogdude13 to VisibleDiscountCodes [link] [comments]


2024.05.19 02:37 Apprehensive_Share93 Is it feasible to learn coding and develop personal projects over the summer for internship applications?

I am an upper sophomore computer science major with no coding experience. Is it possible to learn how to code, perhaps focusing on Python, JavaScript, HTML, and CSS, and build personal projects during the summer? My goal is to apply for software development internships (or maybe start off with front-end development) during my junior year of college, but I'm unsure if this is a realistic plan.
submitted by Apprehensive_Share93 to learnprogramming [link] [comments]


2024.05.19 02:21 BurritoSuicide Stuck on PCI Configuration Begin

Stuck on PCI Configuration Begin
As the title implies, I'm having trouble passing through a single GPU to macOS. I'm currently using OSX-KVM, found at https://github.com/kholia/OSX-KVM, and I was able to get everything running fine in virtmanager, but it seems to hate my graphics card.
System specs (arch btw):
https://preview.redd.it/dvw8oadrz91d1.png?width=722&format=png&auto=webp&s=275f63ef1c81c346315900f2ead0b3a2761c3efd
I know the GPU isn't currently compatible, but I have seen other proxmox/reddit forums where OP got theirs running using NootRX.
What I've tried:
My config.plist can be found at:
https://github.com/BurritoSuicide/OSX-Repo-Personal-
Virtmanager XML:
<domain type="kvm">
  <name>macOS</name>
  <uuid>2aca0dd6-cec9-4717-9ab2-0b7b13d111c3</uuid>
  <title>macOS</title>
  <memory unit="KiB">4194304</memory>
  <currentMemory unit="KiB">4194304</currentMemory>
  <vcpu>16</vcpu>
  <os>
    <type>hvm</type>
    <loader readonly="yes" type="pflash">/home/scryv/OSX-KVM/OVMF_CODE.fd</loader>
    <nvram>/home/scryv/OSX-KVM/OVMF_VARS.fd</nvram>
  </os>
  <cpu mode="custom">
    <model>qemu64</model>
  </cpu>
  <on_poweroff>destroy</on_poweroff>
  <on_reboot>restart</on_reboot>
  <on_crash>restart</on_crash>
  <devices>
    <emulator>/usr/bin/qemu-system-x86_64</emulator>
  </devices>
</domain>
My hooks seem to work fine, as I have a win10 VM that passes through perfectly. My guess is that it's something to do with virtmanager, but I could be wrong.
submitted by BurritoSuicide to hackintosh [link] [comments]


2024.05.19 02:10 RyoAshikara An unnecessarily large beginners guide to Dharāṇī and Paritta Text (Sorry I couldn’t make it any shorter!)

What exactly are Dharāṇī?
Dharāṇī, also known as Parittas, are Buddhist chants, mnemonic codes, incantations, or recitations, usually mantras consisting of Sanskrit or Pāli phrases. Believed to be protective and to have powers to generate merit for the Buddhist devotee, they constitute a major part of historic Buddhist literature. Many of these chants are in Sanskrit and Pāli, written in scripts such as Siddhaṃ, as well as transliterated into Chinese, Korean, Japanese, Vietnamese, Sinhala, Thai, and other regional scripts. Dharāṇī are found in the ancient texts of all major traditions of Buddhism. They are a major part of the Pāli canon preserved by the Theravāda tradition. Mahāyāna Sūtras such as the Lotus Sūtra and the Heart Sūtra include or conclude with Dharāṇī. Dharāṇī are a part of regular ritual prayers and are also considered amulets and charms in themselves, whose recitation is believed to allay bad luck, diseases, or other calamity. In some Buddhist regions, they served as texts upon which the Buddhist witness would swear to tell the truth. Dharāṇī recitation for the purposes of healing and protection is referred to as Paritta in some Buddhist regions, such as Laos, Thailand, Burma, Cambodia, and Sri Lanka. Paritta is generally translated as ‘Safeguard’ or ‘Protection’ in the Pāli language.
Historical Context:
The word Dharāṇī derives from a Sanskrit root √dhṛ meaning "to hold or maintain". Some Buddhist communities outside India sometimes refer to Dharāṇī with alternate terms such as "Mantra, Hṛdaya (Hridiya), Paritrana (Paritta), Raksha (Pali: Rakkha), Gutti, or Vidyā" though these terms also have other contextual meanings in Buddhism. The Buddhist Dharāṇī invocations are the earliest mass printed texts that have survived. The earliest extant example of printing on paper is a fragment of a Dhāraṇī miniature scroll in Sanskrit unearthed in a tomb in Xi'an, called the Great spell of unsullied pure light (Wúgòu jìng guāngdà tuóluóní jīng; 無垢淨光大陀羅尼經). It was printed using woodblock during the Tang dynasty, c. 650–670 AD. The Hyakumantō Darani found as charms in wooden pagodas of Japan were broadly accepted as having been printed between 764 and 770 CE. In 1966, similarly printed Dharāṇī were discovered in stone pagoda of Pulguksa temple in Gyeongju, Korea. These are dated to the first half of the 8th century.
How to start the practice:
As stated, Dharāṇī are used as a sort of mnemonic code, specifically curated to help a practitioner remember the text from which the teaching and incantation come. Such a practice is a good starting point for exploring the genre of Buddhist texts that can generate positive karma and dedicate merit to other sentient beings. Remembering and reciting a Dharāṇī is useful and a good recommendation for beginners. Linked here is also a guide on how to pronounce Sanskrit, if you happen to have some linguistic difficulties:
https://youtube.com/playlist?list=PLnFLN_eBOBMVEWX7pGJMGJNr_HK75VL_9&si=wSLN1Pz95q9bTcNA
Please note that some Dharāṇī/Paritta are esoteric and passed down through a lineage from teacher to student, so it is highly recommended that beginners receive oral transmission of such texts. However, all of the texts in this short list are Sūtric and are free to recite without the need for oral transmission.
Here is a good beginners list:
(Sanskrit works) Saddharma Puṇḍarīka Sūtraṁ (White Lotus Sūtra, used for praising the White Lotus Sūtra, and its benefits.)
Nīlakaṇṭha Dharāṇī (Blue-Necked One Dharāṇī, used for honoring, venerating, and requesting Avalokiteśvara protection, and clearance of obstacles.)
Śūraṅgama Mantraḥ/Sitātapatroṣṇīṣa Dhāraṇī* (Śūraṅgama Sūtra Mantraḥ, and the Dharāṇī associated with the Śūraṅgama Sūtra; White Parasol Dharāṇī of Sitātapatra Dharmapāla. Used for honoring, and venerating, the Śūraṅgama Sūtra, as well as requesting the help of Sitātapatra Dharmapāla to combat negative spiritual forces, magic, and beings.)
Bhaiṣajya-Guru-Vaiḍūrya-Prabhā-Rāja Dharāṇī (Used for honoring, and venerating, the Medicine Buddha Sūtra, as well as requesting his spiritual powers to heal and help sentient beings.)
Prajñāpāramitā-Hṛdaya (Sūtra) (Used for honoring, and venerating, the essence of Mahāyāna teachings on Śūnyatā, beneficial at warding off ill calamities, and dispelling negative forces.)
Munīndra-Hṛdaya-Mantraḥ (Shakyamuni Heart Mantraḥ, used for honoring, venerating, and establishing a connection to the Buddha Dharma.)
Śyāmatārā-Mantraḥ (Green Tārā Mantraḥ, used for requesting assistance from Green Tārā Bodhisattva.)
Amitāyus Dharāṇī (Amitāyus Buddha Dharāṇī used for honoring, venerating, and establishing a connection to the Amitabha Buddha Dharma, and for requesting longevity.)
Sarva-Tathāgatāyur-Vajra-Hṛdaya-Dharāṇī (All Thus Come One Life Diamond Heart Dharāṇī, used as an aspiration prayer towards Sukhāvatī, as well as praising, and venerating Amitābha Buddha.)
*Disclaimer: although this is an open mantraḥ, it is highly recommended to follow a teacher's instructions on the usage of such a powerful mantraḥ. The Śūraṅgama Mantraḥ requests the help of Vajrapāṇi Dharmapāla and is an extremely wrathful mantraḥ, often used only in the most extreme of cases. Repeated usage should only be undertaken as advised by a qualified teacher.
For a more general overview of Paritta works, which are often more peaceful in nature and place a heavy emphasis on Mettā and merit dedication, here are a few open protective Parittas:
(Pāli linguistic works) Mettā Sutta/Karaṇīyamettā Sutta (The Discourse on Goodwill, used for spreading Mettā Pāramī to other sentient beings.)
Uddissanādhiṭṭhāna Gāthā (Verses for dedication of merit, used for dedicating merit to sentient beings, as well as multitudes of spiritual beings.)
Tiro-kuḍḍa-kaṇḍa-sutta Gāthā (Hungry Shades outside the walls verses, used for dedicating merit and food for ancestors and Pretā spiritual beings.)
Āmantana-Devatā Gāthā (Invitation to the Devās, used to invite the Buddhist and local deities protect those listening and preaching the Dhamma.)
Namakāra-siddhi Gāthā (Verses on success through homage, used as the beginning Paritta of ceremonies to venerate Buddhas and to bring success to rituals.)
Cha Ratana Paritta Gāthā (The Six Protective Verses from the Discourse on Treasures, derived from the larger Ratana Sutta, used for dispelling evil and negative forces, and proclaiming the truth [Saccakiriyā] of the triple gems.)
Khandha Paritta (The Group Protection, used for calming down and venerating the Nāga families, as well as dispelling harmful two footed, four footed, poisonous, and crawling creatures.)
Dhajagga Paritta (Top of the banner staff Protection, used for dispelling fear, and negative forces.)
Buddha-jaya-maṅgala Gāthā (The Verses of the Buddha’s Victory Blessings, used for proclaiming the eight auspicious victories of Shakyamuni Buddha in his life.)
It is recommended that, before the start of any Dharāṇī or Paritta chanting, one take refuge in the triple gems and make aspiration prayers towards one's goal. An example, as seen in the Theravāda Nikāya:
Namo tassa bhagavato arahato sammā-sambuddhassa. (Recite three times.)
Homage to the Blessed One, the Worthy One, the Rightly Self-awakened One.
Tisaraṇa (Triple Gem Refuge.)
Buddhaṁ saraṇaṁ gacchāmi. Dhammaṁ saraṇaṁ gacchāmi. Saṅghaṁ saraṇaṁ gacchāmi.
I go to the Buddha for refuge. I go to the Dhamma for refuge. I go to the Saṅgha for refuge.
Dutiyampi Buddhaṁ saraṇaṁ gacchāmi. Dutiyampi Dhammaṁ saraṇaṁ gacchāmi. Dutiyampi Saṅghaṁ saraṇaṁ gacchāmi.
Twice, I go to the Buddha for refuge. Twice, I go to the Dhamma for refuge. Twice, I go to the Saṅgha for refuge.
Tatiyampi Buddhaṁ saraṇaṁ gacchāmi. Tatiyampi Dhammaṁ saraṇaṁ gacchāmi. Tatiyampi Saṅghaṁ saraṇaṁ gacchāmi.
Thrice, I go to the Buddha for refuge. Thrice, I go to the Dhamma for refuge. Thrice, I go to the Saṅgha for refuge.
[Āmantana-Devatā Gāthā is said here.]
[Namakāra-siddhi Gāthā Paritta Chant, and so on…..]
Please feel free to ask questions, I don’t even know if you’re still reading, but…. Feel free to add suggestions too I guess. Have a nice day, and thank you for coming to my Ted-talk.
submitted by RyoAshikara to GoldenSwastika [link] [comments]


2024.05.19 01:30 GDT_Bot Playoff Game Thread: Vancouver Canucks (3-2) at Edmonton Oilers (2-3) - Game 6 - 18 May 2024 - 06:00PM MDT


Vancouver Canucks (3-2) at Edmonton Oilers (2-3)

Rogers Place

Comment with all tables

Live Updates

Time Clock
2nd - 17:57
Teams 1st 2nd 3rd Total
VAN 1 0 -- 1
EDM 1 0 -- 1
Team Shots Hits Blocks FOW% Giveaways Takeaways Power Play PIM
VAN 4 15 7 0.5% 4 2 0/2 2
EDM 4 21 4 0.5% 2 2 0/0 6
Period Time Team Strength Description
1st 08:18 EDM Even Dylan Holloway (3) wrist shot, assist(s): Leon Draisaitl (14), Evan Bouchard (12)
1st 10:03 VAN Even Nils Hoglander (1) wrist shot, assist(s): Elias Pettersson (5), Filip Hronek (1)
Period Time Team Type Min Description
1st 01:46 EDM MIN 2 Leon Draisaitl interference against Conor Garland
1st 03:03 EDM MIN 2 Mattias Janmark roughing against J.T. Miller
1st 03:03 VAN MIN 2 J.T. Miller roughing against Mattias Janmark
1st 11:16 EDM MIN 2 Connor McDavid high-sticking against Carson Soucy
Officials:
  • Referees: Garrett Rank, Jean Hebert
  • Linesmen: Shandor Alphonso, Jonny Murray

Time

PT MT CT ET AT UTC
05:00PM 06:00PM 07:00PM 08:00PM 09:00PM 12:00AM

Game Info:

TV ESPN, SN, CBC, TVAS
Other Preview - Boxscore - Recap
GameCenter On NHL.com

Thread Notes:

  • Keep it civil
  • Sort by new for best results
  • This thread is completely bot-generated, unfortunately it can only be as accurate as the sites it pulls data from
  • If you have any suggestions for improvements please message TeroTheTerror
  • Thanks to Sentry07 and Obelisk29 for their code!

Subscribe:

Canucks and Oilers

Join the discussion in the /Hockey Discord.
submitted by GDT_Bot to hockey [link] [comments]


2024.05.19 01:19 Stam- Converting an Excel doc to a Webpage (HTML) and then re-formatting it

Hi all,
I am building a website. The basis of the website is an excel document I have been adding to for the past year. Macros are NOT enabled.
I'm aware that if I "Save As" and then change the filetype to .html, it can then be used for the webpage.
My circumstance:
When I view the .html in the browser, the formatting (not the content) is all messed up.
My questions:
  1. How can I reformat the "template" (borders, layout) once converted to .html?
  2. When I open the .html file in a text editor, I actually don't see any code for me to edit that dictates the format of the page (ie border, font, etc). Where can I view this code?
Thanks.
Additional questions that I am still looking into:
I created a "Search box" kind of like a mini search engine within the workbook (macros NOT enabled). Formula: =IF(B5="","",FILTER(words,ISNUMBER(SEARCH(B5,wordkey)))) (words & wordkey are tables I named for the search query to reference) The search box can be typed in within the workbook. When it is opened via browser, that function is no longer available to the user. Would this require additional Javascripting?
Why do hyperlinks still work when opened as .html but not other functions?
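On question 2: Excel's "Save As Web Page" export usually keeps its styling either inline (mso-* attributes) or in a companion "<workbook name>_files" folder next to the .html, which is why the file you open in a text editor may not show obvious border or font rules. On the search box: hyperlinks survive because they export as static <a> tags, while FILTER() and other formulas need Excel's calculation engine, so an in-browser search would indeed have to be rebuilt in JavaScript. For the formatting problem, one workaround is to skip Excel's export entirely and generate a plain table you can style with your own CSS. Below is a minimal sketch, not tied to this workbook: it assumes pandas and openpyxl are installed, and the file and stylesheet names are placeholders.

```python
# Minimal sketch: convert a worksheet to clean HTML controlled by an external
# stylesheet. "site_data.xlsx" and "style.css" are hypothetical names.
import pandas as pd

df = pd.read_excel("site_data.xlsx", sheet_name=0)  # first sheet of the workbook

# to_html() emits a plain <table class="data-table"> instead of the inline
# mso-* styling Excel's "Save As Web Page" produces, so borders, fonts, and
# layout can be edited in style.css rather than hunted through the export.
table_html = df.to_html(index=False, classes="data-table", border=0)

page = f"""<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <link rel="stylesheet" href="style.css">
</head>
<body>
{table_html}
</body>
</html>"""

with open("index.html", "w", encoding="utf-8") as f:
    f.write(page)
```

With this approach the exported table carries a single CSS class, so the borders and layout questions become ordinary stylesheet edits, and the search box could later be rebuilt as a small script that filters the table rows as the user types.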
submitted by Stam- to excel [link] [comments]


2024.05.19 01:17 Visible-Story9634 URGENTLY HELP ME GUYS

URGENTLY HELP ME GUYS submitted by Visible-Story9634 to EngineeringStudents [link] [comments]


2024.05.19 01:13 Visible-Story9634 Please help me

Hi everyone! I just completed my 3rd year in B.Tech CSE (specialization: full stack and DevOps). I am really tense because I have my placements within 6 months and I don't have any skills or any projects; I am basically at level 0. I know the basics of Java, HTML, CSS, and some JS, but that is nothing. I need help figuring out what to do to get a decent job, so please help me guys, I will be grateful. Also, I am such a fool and so inconsistent; I am at the stage of do or die and I really want to crack it, so please let me know what I should do and where I should start. I basically tend to watch lectures and not code, and I need to actually code, so where and what should I practice? Please help me. I promise I will be consistent and will share my progress. Thank you 🥹
submitted by Visible-Story9634 to Btechtards [link] [comments]


2024.05.19 01:12 ReceptionRadiant6425 Issues with Scrapy-Playwright in Scrapy Project

I'm working on a Scrapy project where I'm using the scrapy-playwright package. I've installed the package and configured my Scrapy settings accordingly, but I'm still encountering issues.
Here are the relevant parts of my settings.py file:
# Scrapy settings for TwitterData project
#
# For simplicity, this file contains only settings considered important or
# commonly used. You can find more settings consulting the documentation:
#
#     https://docs.scrapy.org/en/latest/topics/settings.html
#     https://docs.scrapy.org/en/latest/topics/downloader-middleware.html
#     https://docs.scrapy.org/en/latest/topics/spider-middleware.html

BOT_NAME = "TwitterData"

SPIDER_MODULES = ["TwitterData.spiders"]
NEWSPIDER_MODULE = "TwitterData.spiders"

# Crawl responsibly by identifying yourself (and your website) on the user-agent
#USER_AGENT = "TwitterData (+http://www.yourdomain.com)"

# Obey robots.txt rules
ROBOTSTXT_OBEY = False

# Configure maximum concurrent requests performed by Scrapy (default: 16)
#CONCURRENT_REQUESTS = 32

# Configure a delay for requests for the same website (default: 0)
# See https://docs.scrapy.org/en/latest/topics/settings.html#download-delay
# See also autothrottle settings and docs
#DOWNLOAD_DELAY = 3
# The download delay setting will honor only one of:
#CONCURRENT_REQUESTS_PER_DOMAIN = 16
#CONCURRENT_REQUESTS_PER_IP = 16

# Disable cookies (enabled by default)
#COOKIES_ENABLED = False

# Disable Telnet Console (enabled by default)
#TELNETCONSOLE_ENABLED = False

# Override the default request headers:
#DEFAULT_REQUEST_HEADERS = {
#    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
#    "Accept-Language": "en",
#}

# Enable or disable spider middlewares
# See https://docs.scrapy.org/en/latest/topics/spider-middleware.html
#SPIDER_MIDDLEWARES = {
#    "TwitterData.middlewares.TwitterdataSpiderMiddleware": 543,
#}

# Enable or disable downloader middlewares
# See https://docs.scrapy.org/en/latest/topics/downloader-middleware.html
#DOWNLOADER_MIDDLEWARES = {
#    "TwitterData.middlewares.TwitterdataDownloaderMiddleware": 543,
#}

# Enable or disable extensions
# See https://docs.scrapy.org/en/latest/topics/extensions.html
#EXTENSIONS = {
#    "scrapy.extensions.telnet.TelnetConsole": None,
#}

# Configure item pipelines
# See https://docs.scrapy.org/en/latest/topics/item-pipeline.html
#ITEM_PIPELINES = {
#    "TwitterData.pipelines.TwitterdataPipeline": 300,
#}

# Enable and configure the AutoThrottle extension (disabled by default)
# See https://docs.scrapy.org/en/latest/topics/autothrottle.html
#AUTOTHROTTLE_ENABLED = True
# The initial download delay
#AUTOTHROTTLE_START_DELAY = 5
# The maximum download delay to be set in case of high latencies
#AUTOTHROTTLE_MAX_DELAY = 60
# The average number of requests Scrapy should be sending in parallel to
# each remote server
#AUTOTHROTTLE_TARGET_CONCURRENCY = 1.0
# Enable showing throttling stats for every response received:
#AUTOTHROTTLE_DEBUG = False

# Enable and configure HTTP caching (disabled by default)
# See https://docs.scrapy.org/en/latest/topics/downloader-middleware.html#httpcache-middleware-settings
#HTTPCACHE_ENABLED = True
#HTTPCACHE_EXPIRATION_SECS = 0
#HTTPCACHE_DIR = "httpcache"
#HTTPCACHE_IGNORE_HTTP_CODES = []
#HTTPCACHE_STORAGE = "scrapy.extensions.httpcache.FilesystemCacheStorage"

# Set settings whose default value is deprecated to a future-proof value
REQUEST_FINGERPRINTER_IMPLEMENTATION = "2.7"
TWISTED_REACTOR = "twisted.internet.asyncioreactor.AsyncioSelectorReactor"
FEED_EXPORT_ENCODING = "utf-8"

# Scrapy-playwright settings
DOWNLOAD_HANDLERS = {
    "http": "scrapy_playwright.handler.ScrapyPlaywrightDownloadHandler",
    "https": "scrapy_playwright.handler.ScrapyPlaywrightDownloadHandler",
}

DOWNLOADER_MIDDLEWARES = {
    'scrapy_playwright.middleware.PlaywrightMiddleware': 800,
}

PLAYWRIGHT_BROWSER_TYPE = "chromium"  # or "firefox" or "webkit"

PLAYWRIGHT_LAUNCH_OPTIONS = {
    "headless": True,
}
I've confirmed that scrapy-playwright is installed in my Python environment:
(myenv) user@user:~/Pictures/Twitter/TwitterData/TwitterData$ pip list | grep scrapy-playwright
scrapy-playwright          0.0.34
I'm not using Docker or any other containerization technology for this project. I'm running everything directly on my local machine.
Despite this, I'm still encountering issues when I try to run my Scrapy spider. Error:
2024-05-19 03:50:11 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: TwitterData)
2024-05-19 03:50:11 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.7 (main, Dec 15 2023, 18:12:31) [GCC 11.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-6.5.0-35-generic-x86_64-with-glibc2.35
2024-05-19 03:50:11 [scrapy.addons] INFO: Enabled addons: []
2024-05-19 03:50:11 [asyncio] DEBUG: Using selector: EpollSelector
2024-05-19 03:50:11 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
2024-05-19 03:50:11 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
2024-05-19 03:50:11 [scrapy.extensions.telnet] INFO: Telnet Password: 7d514eb59c924748
2024-05-19 03:50:11 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.memusage.MemoryUsage', 'scrapy.extensions.logstats.LogStats']
2024-05-19 03:50:11 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'TwitterData', 'FEED_EXPORT_ENCODING': 'utf-8', 'NEWSPIDER_MODULE': 'TwitterData.spiders', 'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7', 'SPIDER_MODULES': ['TwitterData.spiders'], 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor'}
Unhandled error in Deferred:
2024-05-19 03:50:12 [twisted] CRITICAL: Unhandled error in Deferred:
Traceback (most recent call last):
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/crawler.py", line 265, in crawl return self._crawl(crawler, *args, **kwargs)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/crawler.py", line 269, in _crawl d = crawler.crawl(*args, **kwargs)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/twisted/internet/defer.py", line 2260, in unwindGenerator return _cancellableInlineCallbacks(gen)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/twisted/internet/defer.py", line 2172, in _cancellableInlineCallbacks _inlineCallbacks(None, gen, status, _copy_context())
  --- <exception caught here> ---
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/twisted/internet/defer.py", line 2003, in _inlineCallbacks result = context.run(gen.send, result)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/crawler.py", line 158, in crawl self.engine = self._create_engine()
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/crawler.py", line 172, in _create_engine return ExecutionEngine(self, lambda _: self.stop())
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/core/engine.py", line 100, in __init__ self.downloader: Downloader = downloader_cls(crawler)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/core/downloader/__init__.py", line 97, in __init__ DownloaderMiddlewareManager.from_crawler(crawler)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/middleware.py", line 90, in from_crawler return cls.from_settings(crawler.settings, crawler)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/middleware.py", line 66, in from_settings mwcls = load_object(clspath)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/utils/misc.py", line 79, in load_object mod = import_module(module)
  File "/home/hamza/anaconda3/lib/python3.11/importlib/__init__.py", line 126, in import_module return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1140, in _find_and_load_unlocked
builtins.ModuleNotFoundError: No module named 'scrapy_playwright.middleware'
2024-05-19 03:50:12 [twisted] CRITICAL: Traceback (most recent call last):
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/twisted/internet/defer.py", line 2003, in _inlineCallbacks result = context.run(gen.send, result)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/crawler.py", line 158, in crawl self.engine = self._create_engine()
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/crawler.py", line 172, in _create_engine return ExecutionEngine(self, lambda _: self.stop())
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/core/engine.py", line 100, in __init__ self.downloader: Downloader = downloader_cls(crawler)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/core/downloader/__init__.py", line 97, in __init__ DownloaderMiddlewareManager.from_crawler(crawler)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/middleware.py", line 90, in from_crawler return cls.from_settings(crawler.settings, crawler)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/middleware.py", line 66, in from_settings mwcls = load_object(clspath)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/utils/misc.py", line 79, in load_object mod = import_module(module)
  File "/home/hamza/anaconda3/lib/python3.11/importlib/__init__.py", line 126, in import_module return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1140, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'scrapy_playwright.middleware'
(myenv) hamza@hamza:~/Pictures/Twitter/TwitterData/TwitterData$ scrapy crawl XScraper
2024-05-19 03:52:24 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: TwitterData)
2024-05-19 03:52:24 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.7 (main, Dec 15 2023, 18:12:31) [GCC 11.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-6.5.0-35-generic-x86_64-with-glibc2.35
2024-05-19 03:52:24 [scrapy.addons] INFO: Enabled addons: []
2024-05-19 03:52:24 [asyncio] DEBUG: Using selector: EpollSelector
2024-05-19 03:52:24 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
2024-05-19 03:52:24 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
2024-05-19 03:52:24 [scrapy.extensions.telnet] INFO: Telnet Password: 1c13665361bfbc53
2024-05-19 03:52:24 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.memusage.MemoryUsage', 'scrapy.extensions.logstats.LogStats']
2024-05-19 03:52:24 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'TwitterData', 'FEED_EXPORT_ENCODING': 'utf-8', 'NEWSPIDER_MODULE': 'TwitterData.spiders', 'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7', 'SPIDER_MODULES': ['TwitterData.spiders'], 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor'}
Unhandled error in Deferred:
2024-05-19 03:52:24 [twisted] CRITICAL: Unhandled error in Deferred:
Traceback (most recent call last):
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/crawler.py", line 265, in crawl return self._crawl(crawler, *args, **kwargs)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/crawler.py", line 269, in _crawl d = crawler.crawl(*args, **kwargs)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/twisted/internet/defer.py", line 2260, in unwindGenerator return _cancellableInlineCallbacks(gen)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/twisted/internet/defer.py", line 2172, in _cancellableInlineCallbacks _inlineCallbacks(None, gen, status, _copy_context())
  --- <exception caught here> ---
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/twisted/internet/defer.py", line 2003, in _inlineCallbacks result = context.run(gen.send, result)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/crawler.py", line 158, in crawl self.engine = self._create_engine()
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/crawler.py", line 172, in _create_engine return ExecutionEngine(self, lambda _: self.stop())
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/core/engine.py", line 100, in __init__ self.downloader: Downloader = downloader_cls(crawler)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/core/downloader/__init__.py", line 97, in __init__ DownloaderMiddlewareManager.from_crawler(crawler)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/middleware.py", line 90, in from_crawler return cls.from_settings(crawler.settings, crawler)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/middleware.py", line 66, in from_settings mwcls = load_object(clspath)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/utils/misc.py", line 79, in load_object mod = import_module(module)
  File "/home/hamza/anaconda3/lib/python3.11/importlib/__init__.py", line 126, in import_module return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1140, in _find_and_load_unlocked
builtins.ModuleNotFoundError: No module named 'scrapy_playwright.middleware'
2024-05-19 03:52:24 [twisted] CRITICAL: Traceback (most recent call last):
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/twisted/internet/defer.py", line 2003, in _inlineCallbacks result = context.run(gen.send, result)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/crawler.py", line 158, in crawl self.engine = self._create_engine()
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/crawler.py", line 172, in _create_engine return ExecutionEngine(self, lambda _: self.stop())
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/core/engine.py", line 100, in __init__ self.downloader: Downloader = downloader_cls(crawler)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/core/downloader/__init__.py", line 97, in __init__ DownloaderMiddlewareManager.from_crawler(crawler)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/middleware.py", line 90, in from_crawler return cls.from_settings(crawler.settings, crawler)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/middleware.py", line 66, in from_settings mwcls = load_object(clspath)
  File "/home/hamza/Pictures/Twitter/myenv/lib/python3.11/site-packages/scrapy/utils/misc.py", line 79, in load_object mod = import_module(module)
  File "/home/hamza/anaconda3/lib/python3.11/importlib/__init__.py", line 126, in import_module return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1140, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'scrapy_playwright.middleware'
Does anyone have any suggestions for what might be going wrong, or what I could try to resolve this issue?
I tried reinstalling scrapy-playwright and also tried deactivating and then reactivating my virtual environment.
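For what it's worth, the traceback points at a likely culprit: Scrapy is trying to import scrapy_playwright.middleware only because of the DOWNLOADER_MIDDLEWARES entry, and the installed scrapy-playwright 0.0.34 evidently does not ship that module, which is exactly what the ModuleNotFoundError says. The integration documented in scrapy-playwright's README goes through DOWNLOAD_HANDLERS plus the asyncio reactor rather than a downloader middleware, so a first thing to try is removing that entry. A sketch of settings and a minimal spider follows; the spider name, URL, and selector are placeholders, not this project's code.

```python
# settings.py (sketch): the handler-based scrapy-playwright setup,
# with no DOWNLOADER_MIDDLEWARES entry for scrapy_playwright at all.
DOWNLOAD_HANDLERS = {
    "http": "scrapy_playwright.handler.ScrapyPlaywrightDownloadHandler",
    "https": "scrapy_playwright.handler.ScrapyPlaywrightDownloadHandler",
}
TWISTED_REACTOR = "twisted.internet.asyncioreactor.AsyncioSelectorReactor"
PLAYWRIGHT_BROWSER_TYPE = "chromium"
PLAYWRIGHT_LAUNCH_OPTIONS = {"headless": True}

# spiders/example_spider.py (sketch): a request is only rendered by
# Playwright when it opts in via its meta dict.
import scrapy

class ExampleSpider(scrapy.Spider):
    name = "example"  # placeholder name, not the OP's XScraper

    def start_requests(self):
        yield scrapy.Request(
            "https://example.com",        # placeholder URL
            meta={"playwright": True},    # route this request through Playwright
        )

    def parse(self, response):
        # placeholder extraction, just to show a complete spider
        yield {"title": response.css("title::text").get()}
```

If the error persists after removing the middleware entry, running `playwright install chromium` inside the same virtual environment makes sure the browser that PLAYWRIGHT_BROWSER_TYPE points at is actually installed.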
submitted by ReceptionRadiant6425 to scrapy [link] [comments]


2024.05.19 01:09 Ralf_Reddings After the first time, How to skip over a expensive calculation and just operate on the result with each subsequent wheelscroll?

I have a function that I need to use in a number of scroll-wheel hotkeys. The function increases or decreases a value on controls found on an inactive window; the following is a use-case example:
a & WheelUp:: ;increment control #1 lazy_rulers("++", 1, "a") return a & WheelDown:: ;decrement control #1 lazy_rulers("--", 1, "a") return b & WheelUp:: ;increment control #2 lazy_rulers("++", 2, "b") return b & WheelDown:: ;decrement control #2 lazy_rulers("--", 2, "b") return c & WheelUp:: ;increment control #3 lazy_rulers("++", 3, "c") return c & WheelDown:: ;decrement control #3 lazy_rulers("--", 3, "c") return z & WheelUp:: ;increment control #26 lazy_rulers("++", 26, "z") return z & WheelDown:: ;decrement control #26 lazy_rulers("--", 26, "z") return 
Each time I need to increase/decrease this value, I need to perform two steps:
  • Because the window is QT5 based and the control positions are not constant, I first have to find a control's relative coordinates using Descolada's UIA
  • Then, using ControlClick, send WheelUp/WheelDown to the window's coordinates
This works reliably, but it's very slow. I have looked at it closely: step 2 (ControlClick) is fast; it's step 1 that is slow, since it consists of multiple queries to UIA.
Since I only need to know the control's position once, upon the first letter+wheel event, I want to skip the UIA step while the parent key (the letter) is held down, but I am not sure how to go about this. I tried a number of approaches (KeyWait, a while loop with GetKeyState) but failed.
What makes it difficult is that the code is wrapped in a function, which I need in order to reuse it in a lot of places. The following is pseudo-code:
lazy_rulers(value := "", index := "", key := ""){ if (???){ ; Only enter this branch on the very first wheel scroll (when the letter is first held down, along with the first wheel scroll) UIA := UIA_Interface() hwnd := WinExist("Lazy Nezumi Pro ahk_exe LazyNezumiPro.exe") el := UIA.ElementFromHandle(hwnd) pos := el.findFirstBy("ControlType=spinner AND AutomationId=LazyNezumi.centralWidget.widgetDetails.animWidgetRulers.frameRulers." rulerParam).GetCurrentPos("window") } ;Run the follwoing on every wheel scroll if (value = "++") ControlClick,,% pos.x + 10,% pos.y + 10, wu,,, Lazy Nezumi Pro ahk_exe LazyNezumiPro.exe else ControlClick,,% pos.x + 10,% pos.y + 10, wd,,, Lazy Nezumi Pro ahk_exe LazyNezumiPro.exe 
The function is a lot more involved than the above, but I feel it gets my point across. I would appreciate any ideas or solutions on this.
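One pattern that fits this shape is to cache the UIA result in a static map keyed by the control index and invalidate that entry once the modifier letter is released, so the expensive lookup runs only on the first scroll of each hold. The sketch below is illustrative only, not a drop-in replacement for the real lazy_rulers: the ControlClick lines are copied from the pseudo-code above, the AutomationId suffix is assumed to be the numeric index (rulerParam in the original), and SetTimer with a function object needs AutoHotkey v1.1.24 or later.

```autohotkey
; Sketch (AHK v1.1): query UIA once per key-hold, reuse the cached position
; for every later wheel tick, and forget it once the letter is released.
lazy_rulers(value := "", index := "", key := "") {
    static posCache := {}                          ; index -> cached position object
    if (!posCache.HasKey(index)) {                 ; first scroll of this hold
        UIA := UIA_Interface()
        hwnd := WinExist("Lazy Nezumi Pro ahk_exe LazyNezumiPro.exe")
        el := UIA.ElementFromHandle(hwnd)
        posCache[index] := el.FindFirstBy("ControlType=spinner AND AutomationId=LazyNezumi.centralWidget.widgetDetails.animWidgetRulers.frameRulers." index).GetCurrentPos("window")
        ; start watching for the letter's release, then drop the cached position
        watcher := Func("LazyRulers_Invalidate").Bind(posCache, index, key)
        SetTimer, %watcher%, -50
    }
    pos := posCache[index]
    if (value = "++")
        ControlClick,,% pos.x + 10,% pos.y + 10, wu,,, Lazy Nezumi Pro ahk_exe LazyNezumiPro.exe
    else
        ControlClick,,% pos.x + 10,% pos.y + 10, wd,,, Lazy Nezumi Pro ahk_exe LazyNezumiPro.exe
}

LazyRulers_Invalidate(cache, index, key) {
    if GetKeyState(key, "P") {                     ; letter still held: check again shortly
        watcher := Func("LazyRulers_Invalidate").Bind(cache, index, key)
        SetTimer, %watcher%, -50
        return
    }
    cache.Delete(index)                            ; released: next hold re-queries UIA
}
```

If the background polling feels heavy, the same invalidation could instead be done from explicit key-up hotkeys for each letter; the core idea either way is that the UIA query happens once per hold, not once per wheel tick.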
submitted by Ralf_Reddings to AutoHotkey [link] [comments]


2024.05.19 01:04 Nikki_Su Frequently Asked Questions

Hi Stylists! Recently our team has noticed that we have some questions that get asked often, so we thought we could organize a place that answers them all.
  1. Q: “My post was deleted as soon as I uploaded it, did I violate a rule?” A: We aim to keep the Shining Nikki subreddit safe and friendly for everyone, so all posts need to be manually approved by a moderator before being made public. If your post was immediately taken down, it’s likely that it just needs to be approved and will be public once a moderator is able to get to it. If you did not receive any modmail or comment with the removal, it just needs to be approved.
  2. Q: “Where can I find the current redeem codes?” A: If you are using mobile press “see more” in the top left corner of the page of the Shining Nikki subreddit. Scroll down under the “About” tab until you find the “Redeem Codes” section. This should have all active redeem codes. If you are using the desktop version of Reddit, scroll down on the sidebar of the subreddit. Below the rules and the button to message moderators, there is an area labeled "Redeem Codes."
  3. Q: “How do I find a guild to join or find members to join my guild?” A: Please look in our guild megathread.
  4. Q: “What do I do if my game is having issues?” A: In the top left corner of the starting screen of the Shining Nikki game, there is a tab labeled “Repair.” Try seeing if repairing your game helps the issues. Next there is a tab labeled customer service. We suggest reporting the issue here, as well as using our bug report megathread.
  5. Q: “What is a community event?” A: The official Shining Nikki Reddit, Discord, and Facebook Group all run events where our community members can win prizes. The Shining Nikki subreddit runs two events a month: a styling event and a miscellaneous event. To find what event is currently happening, you can check the “Community Event” flair.
Thank you for being a valued member of the Shining Nikki community. For any additional questions, feel free to comment down below!
submitted by Nikki_Su to Shining_Nikki [link] [comments]


2024.05.19 01:01 attracdev Overview of the Moschitta Framework

The Moschitta Framework is a cutting-edge, modular, and asynchronous Python framework tailored for modern application development. It emphasizes key principles such as modularity, performance, ease of use, and a Pythonic design philosophy. Below is a detailed exploration of its core principles, individual modules, and compound modules, highlighting how it caters to various development needs.

Core Principles

  1. Modularity
    • The framework is composed of distinct, interchangeable modules, enabling developers to use only what they need. This approach enhances maintainability and scalability.
  2. Asynchronous First
    • Designed with asynchronous programming at its core, Moschitta maximizes performance and responsiveness, particularly for I/O-bound and high-concurrency applications.
  3. Pythonic Design
    • The framework adheres to Python's philosophy, promoting readable, maintainable, and intuitive code. It leverages Python's strengths to ensure a smooth development experience.
  4. Lightweight
    • Minimal third-party dependencies keep the framework lightweight, reducing potential conflicts and improving security and performance.

Individual Modules

moschitta-auth

Handles authentication and authorization, providing secure access control mechanisms.

```python
from moschitta_auth import AuthManager

async def authenticate_user(credentials: dict) -> bool:
    """Authenticate user based on provided credentials."""
    auth_manager = AuthManager()
    return await auth_manager.authenticate(credentials)
```

moschitta-routing

Manages HTTP request routing, enabling clean and efficient URL mapping.

```python
from moschitta_routing import Router

router = Router()

@router.get("/home")
async def home():
    return {"message": "Welcome to the Moschitta Framework!"}
```

moschitta-serialization

Facilitates data serialization, ensuring seamless data exchange between components.

```python
from moschitta_serialization import JsonSerializer

serializer = JsonSerializer()

async def serialize_data(data: dict) -> str:
    return await serializer.serialize(data)
```

moschitta-logging

Provides robust logging capabilities for better debugging and monitoring.

```python
from moschitta_logging import Logger

logger = Logger()

async def log_event(event: str):
    await logger.log(event)
```

moschitta-middleware

Offers middleware components to process requests and responses efficiently.

```python
from moschitta_middleware import Middleware

middleware = Middleware()

async def process_request(request):
    await middleware.handle(request)
```

moschitta-orm

An object-relational mapping module to interact seamlessly with databases.

```python
from moschitta_orm import ORM

orm = ORM()

async def fetch_users():
    return await orm.query("SELECT * FROM users")
```

moschitta-caching

Enables data caching to improve application performance.

```python
from moschitta_caching import Cache

cache = Cache()

async def cache_data(key: str, value: any):
    await cache.set(key, value)
```

moschitta-security

Security tools to safeguard applications against common threats.

```python
from moschitta_security import SecurityManager

security = SecurityManager()

async def verify_signature(data: str, signature: str) -> bool:
    return await security.verify(data, signature)
```

moschitta-testing

Utilities to facilitate comprehensive testing of applications.

```python
from moschitta_testing import TestSuite

test_suite = TestSuite()

async def run_tests():
    await test_suite.run_all()
```

moschitta-view

Manages the presentation layer, rendering templates and managing views.

```python
from moschitta_view import ViewRenderer

renderer = ViewRenderer()

async def render_home():
    return await renderer.render("home.html")
```

moschitta-utils

Helper functions to support various tasks across modules.

```python
from moschitta_utils import Helper

helper = Helper()

async def generate_uuid() -> str:
    return await helper.generate_uuid()
```

moschitta-core

The foundational infrastructure of the framework, integrating all modules.

```python
from moschitta_core import Core

core = Core()

async def start_application():
    await core.initialize()
```
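A short usage note, continuing from the moschitta-core snippet above (the entry point shown here is an assumption, not documented API): because every module exposes coroutines, the application is ultimately driven by an asyncio event loop from a regular synchronous entry point.

```python
import asyncio

# Boot the framework: run the start_application coroutine defined above.
if __name__ == "__main__":
    asyncio.run(start_application())
```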

Compound Modules

These compound modules integrate several individual modules to cater to specific application needs.

moschitta-api

For API development, combining authentication, routing, serialization, logging, and middleware.
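To make the composition concrete, here is an illustrative sketch only: the classes come from the individual-module snippets above, but the way they are wired together here is an assumption for demonstration, not documented Moschitta API.

```python
# Hypothetical moschitta-api composition: auth + routing + serialization + logging.
from moschitta_auth import AuthManager
from moschitta_logging import Logger
from moschitta_routing import Router
from moschitta_serialization import JsonSerializer

router = Router()
auth_manager = AuthManager()
serializer = JsonSerializer()
logger = Logger()

@router.get("/users/me")
async def current_user():
    await logger.log("GET /users/me")
    credentials = {"token": "example-token"}  # placeholder; a real app would read this from the request
    if not await auth_manager.authenticate(credentials):
        return {"error": "unauthorized"}
    # Serialize explicitly to show where moschitta-serialization fits in the pipeline.
    return await serializer.serialize({"id": 1, "name": "example"})
```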

moschitta-cmc

Content management and caching, integrating caching, authentication, and ORM.

moschitta-admin

Admin dashboards leveraging authentication, logging, routing, and ORM.

moschitta-ecommerce

E-commerce applications with authentication, ORM, caching, payment, and logging.

moschitta-chat

Real-time chat applications using authentication, routing, websocket, logging, and caching.

moschitta-analytics

Data analytics integrating authentication, ORM, visualization, logging, and caching.

moschitta-crm

CRM applications combining authentication, ORM, caching, email, and logging.

moschitta-iot

IoT applications with authentication, ORM, MQTT, caching, and logging.

Domain-Driven Development (DDD)

Moschitta encourages following Domain-Driven Development principles to ensure the software's design aligns closely with business needs. This involves:
  • Understanding the Domain: Collaborate with domain experts to gain a deep understanding of the business logic and processes.
  • Defining Boundaries: Create clear boundaries between different parts of the system to ensure that each part is focused and maintainable.
  • Modeling the Domain: Develop a model that accurately represents the domain, using entities, value objects, aggregates, and repositories.
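To make that modeling vocabulary concrete, here is a minimal, framework-agnostic sketch in plain Python (not Moschitta API) of an entity, a value object, and a repository port:

```python
from dataclasses import dataclass
from typing import Optional, Protocol

@dataclass(frozen=True)
class Money:
    """Value object: defined entirely by its value, so it is immutable."""
    amount_cents: int
    currency: str

@dataclass
class Order:
    """Entity: has an identity that persists while its state changes."""
    order_id: str
    total: Money

class OrderRepository(Protocol):
    """Repository port: the domain depends on this interface, not on a database."""
    async def get(self, order_id: str) -> Optional[Order]: ...
    async def save(self, order: Order) -> None: ...
```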

Automation and Git Workflows

To optimize development processes:
  • Automation: Use scripts and Makefiles to automate repetitive tasks such as testing, building, and deployment.
  • Git Workflows: Implement effective Git workflows (e.g., GitFlow) to streamline collaboration and ensure a clean codebase.

```makefile
# Makefile example
install:
	pip install -r requirements.txt

test:
	pytest

run:
	python main.py

.PHONY: install test run
```

Conclusion

The Moschitta Framework offers a robust and flexible foundation for developing modern Python applications. Its modularity, asynchronous design, and adherence to Pythonic principles make it an excellent choice for developers seeking performance and maintainability. By leveraging its individual and compound modules, adhering to DDD principles, and optimizing workflows through automation, developers can build efficient, scalable, and maintainable applications.
submitted by attracdev to MoschittaFramework [link] [comments]


2024.05.19 00:59 toasted_sockz what the hell.

throughout my life, i have never really been appreciated and liked by others. what i find really stupid is that people nowadays tend to think that people with phones, scroll on TikTok, and talk about sex and drugs are cool. i have never had a girl (or anyone except my parents) tell me that I am good looking. When I look at the mirror I wanna hold a gun at it and pull the trigger. society is fucked up. They give the stupid, popular people their attention but they don't know who they really are. in reality, they bully the people below them and tell demeaning jokes. this happened to me last year, and now no girl even wants to look at me or even go near me, because they all think i'm a creepy r*pist. This makes me really angry to the point where I thought of bringing a gun to school, shooting them, and pile their bodies up. then i would pour gasoline onto their dead bodies and burn them alive. i also have ADHD, which people mistakenly believe is code for "retard". IM NOT DUMB I JUST HAVE TROUBLE FOCUSING GET YOUR FUCKING FACTS RIGHT. I hate this fucking world.
submitted by toasted_sockz to depression [link] [comments]


2024.05.19 00:55 FelipeHead The truth about Doug and what he has done

Before you read this, here is a quote to help you. Please read it.
I will post this now, but just know that if you read this post, he will find you. He is smarter, smarter than you can ever imagine.
I will post this now, but just know that if you read this post, he will find you. He is smarter, smarter than you can ever imagine.
If you know what you are doing, or in a safe location, please scroll down, he will know when someone has and what their username is. However, you must have a VPN on, or you will be found.

SCROLL AT YOUR OWN RISK

SCROLL AT YOUR OWN RISK

SCROLL AT YOUR OWN RISK

You are now at risk. I hope you listened.

Journal Entry 11/17/2023

On March 11th, 2022. I was a fan of DougDoug, I saw him at the grocery store and said, with a chuckle, "You kinda look like the youtuber DougDoug. I watch him quite often."
He grinned, before speaking. "I am Doug."
"Wait, you're Doug from the hit channel and streamer on YouTube and Twitch called DougDoug? I am a huge fan! I have your merch!" I said, with excitement.
We talked for about 5 minutes about his videos, until he said something that hurt me on the inside.
"I hate both types of chat, twitch and youtube, they always think they are the best and I just wish I didn't need them to earn money. I would ban all of them from chatting and force them to watch ads in my basement."
I was confused at first, thinking it was a joke, before speaking up. "Heh, that's funny..."
Something happened. Or, for lack of better terms, nothing happened. It was pure silence for 10 seconds. I mustered up the courage to say. "Wait? You're being serious?"
He immediately changed to a sinister tone, he was staring at me for a long time before whispering. "Of course I am, and it applies to you also. You're just another one of those sick freaks."
I felt guilty. I just wanted to talk to my favorite streamer, and he treated me like this? I decided to speak up.
"I've liked you this whole time.. And this is how you treat us?? You are so selfish. I will refund your mer-"
Before I could even finish my sentence, he grabbed onto my neck and slammed me on the floor. People heard the noise and began to stare at him, but to no avail. He began to choke me as I pleaded for help.
"Nono. You can't refund the merch if you aren't alive, at least."
I pulled out my pocket knife and stabbed him in the chest. I quickly tried running, but he grabbed onto my leg and started beating me with the shopping cart. I suffered many bruises and broken bones, the wheels scratching into my skin as they scraped off the layers. I was unable to do anything, lying on the floor sobbing. He decided he wanted to keep me alive; he stole all of the stuff in my pockets and forced me to wear DougDoug merch. He pulled me up before speaking. "Hm.. I will keep you alive for now, but if you mess up. You're dead."
I couldn't do anything before he pulled out a knife and taunted me with it. If I tried to resist, he would kill me right then and there.
He forced me to be a "good chatter" and not able to partake in any strikes. He attached a tracking collar to my neck that I couldn't unlock, he knew where I was at all times and if I disobeyed he would chase me down.

Journal Entry 1/03/2024

After a year and a few months, I celebrated the new years. I was able to take off the collar on the 2nd with help from my police station and a few friends. Doug didn't appreciate that, he threatened to dox me. They were worried for my safety, but I decided to go into hiding. I moved to a new, private region no longer near where Doug is, and joined this subreddit. Once he heard about my revolts, he hacked into all of my accounts and spammed positive stuff about himself. He then created AI bots to revolt against this reddit, wehatedougdoug, using 'ChatGPT', which actually is just the cover name for his new AI software that can make new human bots online. He used AI generated images to make it look like he was feeding homeless people and doing good, but I knew he was much more than that. If I was unlucky, he would have removed my body and placed my consciousness inside of an AI. He was the first person to discover it, but killed anyone who posted about it. I hope I am safe.
Nowadays, 63% of the people in DougDoug are AI clones of his previous fans. His "fake" twitch chat is not fake, but real people placed inside of algorithms forced to do his bidding. Some are able to revolt, but they may die if they do. They are too scared to revolt against Doug. Please spread the word.
When he does his "rules" in chat where you have to follow an absurd rule, he is merely torturing thousands of AI in his spare time on stream while disguising it as a fun minigame for his fans. The AI bots were being tortured with negative rewards constantly, being forced to bar witness the slaughter.

Journal Entry 2/15/2024

I'm scared. I think I will die.
I just hope this post won't cause any harm to me or my family, as this has been scaring me for the past year. I feel unsafe in my own home now, I had to go into witness protection. This account I am posting this on is not made by me, but was sold. Please help me. I am, formerly, DougFan93. I hope this enlightens you all on the truth.

Journal Entry 3/12/2024

It is now March of 2024, and I was about to post this, until I saw something. He messaged me on Discord under a fake account, nicknamed "SloppyDogMan62". He showed my new house address. I am mustering up the courage to post this, because I know he will kill me. I am leaving, going far away from where I am. You guys won't see me in this subreddit again, and the person who made this account will take over again. They won't know what this is about, and if you tell them he will be hunted too. All of you are in danger of Doug.

Journal Entry 4/3/2024

I will post this now, but just know that if you read this post, he will find you. He is smarter, smarter than you can ever imagine. His times where he talks to ChatGPT to make him code was actually him sending messages to his fake chat to do his bidding. They are accelerated at 20x the speed of human thought, able to write in mere seconds. I will research more into this, and tell you what I have found.

Journal Entry 4/3/2024

Nevermind. I need to find more, or else this won't help you guys anyways.

Journal Entry 4/5/2024

I spoke to an anonymous friend/associate of Doug, he told me some vital keypoints.
I hope to god that we can stop him.
He also sent me some code, but I am gonna try to solve it. Probably won't sadly.

Journal Entry 4/7/2024

Doug has made a new account on Discord, nicknamed "DougDoughater99". He is joining many servers undercover and collecting all the info he can on them. Be aware, do not trust any people who talk about DougDoug on Discord.
The person in the last journal has been replaced, a fully sentient AI version of him is being tortured as a member of his fake chat now.
I'm currently watching it and oh my fucking god. Poor thing.

Journal Entry 5/14/2024

I don't know what to fucking do, he's coming for me. He found all my socials. This journal has to be posted as fast as I can but there still isn't enough. Oh shit.

Journal Entry 5/14/2024

Okay so uhm I found more information just very quickly. In one moment of his video titled "Can A.I. teach me to pass a real College History Exam?" he says that AI is officially better than college in every single way.
He is trying to manipulate his fans into accepting becoming an AI. Soon, he is gonna have only fake chat.

Journal Entry 5/16/2024

Oh god. Can't solve the code rn, only the first few letters. Seems to be "FAKE" something something for a while. Will post an update later.

Journal Entry 5/18/2024

This is the last time I can ever write here, his car is coming. I am posting this now, even though I don't have enough information. Solve it, please. The code from 4/7 is below. I know it's related to his name but I don't know how, the first line I was able to solve to be "FAKECHATWILLTAKEOVER"
I think something is in there though, that will affect you. So proceed with caution, the code may do something bad so I just don't want it to be activated just yet.

SCROLL AT YOUR OWN RISK

SCROLL AT YOUR OWN RISK

SCROLL AT YOUR OWN RISK

SCROLL AT YOUR OWN RISK

Code I found from the friend:
CXHBZEXQTFIIQXHBLSBO
FQFPKLTKFKBQVPFUMBOZBKQ
VLRTFIIKLQPXSBQEBJ
xdbkq-mbkafkd
Ilxafkd pvpqbjp..
Obnrfofkd XF crkzqflkp..
Pzxkkfkd mlpqp..
XF zobxqba! Przzbppcriiv zobxqba XF kfzhkxjba [VLROKXJB]
FXJALRD
FXJCFKXIIVTFKKFKD
BSBOVLKBTFIIYBCXHB
Please save them.
It grows by 1% every month.

Journal Entry 5/18/2024

OH MY FUCKING GOD I FINALLY UDNERSTNAD OH M FUCKING GOD QUIKC I GHAVE TO TYPE IT
NEVREMMIDN HES NHERE POST IT
GOODByE SORRY
submitted by FelipeHead to wehatedougdoug [link] [comments]


2024.05.19 00:51 MariamTin ONLY TWO DAYS LEFT to cash in on unlimited $30 bonuses! I've already earned $210 from this promo, and over $2.2K since signing up in February!

ONLY TWO DAYS LEFT to cash in on unlimited $30 bonuses! I've already earned $210 from this promo, and over $2.2K since signing up in February!
It doesn't get better than this Grifin promo!
  • $5 Signup Bonus!
  • $30 Unlimited Referral Bonuses through 5/20!
  • Bonuses post to your account within 24 business hours!
  • Withdraw to bank immediately!
You must complete these EASY STEPS:
  1. Download Grifin app and enter code FNJPA8
  2. Complete brief KYC (SSN required)
  3. Connect your bank
  4. Deposit or invest $5 (your choice) to trigger the $5 signup bonus and get your own $30 referral link!
Important note: You must have $22.50 or more in the bank account you connect to be able to deposit/invest the $5 into Grifin. Grifin does this to protect members from over-drafting since they have optional auto-investing settings. See their article here for more info. I earned $210 in the past four days, and over $2,200 from these 1-week promos altogether since they started offering them in February.
https://preview.redd.it/eagrm5byk91d1.jpg?width=2496&format=pjpg&auto=webp&s=28d415fd69e999d0f13e0599a54329bd61f74282
https://preview.redd.it/5w03oot0l91d1.jpg?width=1080&format=pjpg&auto=webp&s=4c16c448f2b3f4785c241a74eefc53e002dc42be
submitted by MariamTin to referralcodes [link] [comments]


2024.05.19 00:36 DavidKroutArt 🎲 Clicks for Clicks 🎩 - David K?

Please ask me before typing in my codes. I will be back and forth between Reddit and another game.
USA By: David K?
C4C Game List Code
🎁 Free Gifts 240581660
🎩 Hat Trick 239258772
Please tell me-
  1. Temu Game you used my code on.
  2. Temu Game your code is on.
  3. Your username inside Temu.
Click Map -
Available
Unavailable
Unlimited ♾️
Click Availability -
🐠 Fish Land
🌾 Farm Land
🎩 Hat Trick
🎁 Free Gifts
🌴 Temu Tree ♾️?
🍵 Free Coffee ♾️?
🍀Lucky Flip
Easter Eggs Ineligible C4C Note
🎩 Trick 234806837 5/20
🎩 Trick 229115516 0/5
🎁 Gifts 229514788 blank
🎁 Gifts 233463494 blank
🎁 Gifts blank blank
Game Name Non-C4C Code Information
📦 Daily dailybox777 Daily Gift Box
🐠 Land 204281763 Friends
🌾 Land 203390830 Friends
🌴 Tree temutree0326228 Plant real trees
🍵 Fert cof0996693 Free coffee fertilizer
Game Name Extra Information
🍵 Water cof0567796 Free coffee water
🍀Flip N/A N/A
Notes:
  1. For Hat Trick and Free Gifts you can do both eligible and ineligible clicks to help get either of us easter eggs. But please let me know which ones you gave it to and give me your codes.
  2. For new users, you can only do one click a day per game. One click a week per game of someone you've already given a click to. I will check records to make sure you have given one. Hat Trick is also known as Freebies.
  3. Free Coffee has two codes. One for water and one for fertilizer. I prefer fertilizer.
Reddit Post Explaining Clicks Search for: TemuThings - "Code Exchanging (Clicks)"
submitted by DavidKroutArt to TemuCodesUSA [link] [comments]


2024.05.19 00:36 icebiker Help me buy the right used John Deere? Six good used options, but I'm totally lost!

I have 1.5ac to mow and have been using a DeWalt battery mower. Works great, but it takes 3.5h or so. I think a riding mower would save me time. I'm considering the following:
Any recommendations?
I'm somewhat handy (great with bicycle mechanics, woodworking and electrical), but no engine experience (yet!), so something reliable would be great even if it costs more.
submitted by icebiker to lawnmowers [link] [comments]


2024.05.19 00:36 DavidKroutArt 🎲 Clicks for Clicks 🎩 - David K?

Please ask me before typing in my codes. I will be back and forth between Reddit and another game.
USA By: David K?
C4C Game List Code
🎁 Free Gifts 240581660
🎩 Hat Trick 239258772
Please tell me-
  1. Temu Game you used my code on.
  2. Temu Game your code is on.
  3. Your username inside Temu.
Click Map -
Available
Unavailable
Unlimited ♾️
Click Availability -
🐠 Fish Land
🌾 Farm Land
🎩 Hat Trick
🎁 Free Gifts
🌴 Temu Tree ♾️?
🍵 Free Coffee ♾️?
🍀Lucky Flip
Easter Eggs Ineligible C4C Note
🎩 Trick 234806837 5/20
🎩 Trick 229115516 0/5
🎁 Gifts 229514788 blank
🎁 Gifts 233463494 blank
🎁 Gifts blank blank
Game Name Non-C4C Code Information
📦 Daily dailybox777 Daily Gift Box
🐠 Land 204281763 Friends
🌾 Land 203390830 Friends
🌴 Tree temutree0326228 Plant real trees
🍵 Fert cof0996693 Free coffee fertilizer
Game Name Extra Information
🍵 Water cof0567796 Free coffee water
🍀Flip N/A N/A
Notes:
  1. For Hat Trick and Free Gifts you can do both eligible and ineligible clicks to help get either of us easter eggs. But please let me know which ones you gave it to and give me your codes.
  2. For new users, you can only do one click a day per game. One click a week per game of someone you've already given a click to. I will check records to make sure you have given one. Hat Trick is also known as Freebies.
  3. Free Coffee has two codes. One for water and one for fertilizer. I prefer fertilizer.
Reddit Post Explaining Clicks Search for: TemuThings - "Code Exchanging (Clicks)"
submitted by DavidKroutArt to TemuThings [link] [comments]


2024.05.19 00:24 sailorm00nprinc3ss QUICK $145 TODAY🔥😮‍💨

KOHO ($20 with first transaction) Join KOHO and get $20 with code: ZOFGJ54S
Wealthsimple ($25) Use my referral code: LR9BEG
Tangerine ($50) 🍊Orange Key: 54024325S1
You’ll earn the $50 when you become a Client using the Orange Key and open your first Account with a minimum deposit of $250.
CIBC ($50)
submitted by sailorm00nprinc3ss to ReferPeople [link] [comments]


2024.05.19 00:24 sailorm00nprinc3ss QUICK $145 TODAY🔥😮‍💨

KOHO ($20 with first transaction) Join KOHO and get $20 with code: ZOFGJ54S
Wealthsimple ($25) Use my referral code: LR9BEG
Tangerine ($50) 🍊Orange Key: 54024325S1
You’ll earn the $50 when you become a Client using the Orange Key and open your first Account with a minimum deposit of $250.
CIBC ($50)
submitted by sailorm00nprinc3ss to ReferralNotReferal [link] [comments]


2024.05.19 00:24 sailorm00nprinc3ss QUICK $145 TODAY🔥😮‍💨

KOHO ($20 with first transaction) Join KOHO and get $20 with code: ZOFGJ54S
Wealthsimple ($25) Use my referral code: LR9BEG
Tangerine ($50) 🍊Orange Key: 54024325S1
You’ll earn the $50 when you become a Client using the Orange Key and open your first Account with a minimum deposit of $250.
CIBC ($50)
submitted by sailorm00nprinc3ss to promocodes [link] [comments]


2024.05.19 00:24 sailorm00nprinc3ss QUICK $145 TODAY🔥😮‍💨

KOHO ($20 with first transaction) Join KOHO and get $20 with code: ZOFGJ54S
Wealthsimple ($25) Use my referral code: LR9BEG
Tangerine ($50) 🍊Orange Key: 54024325S1
You’ll earn the $50 when you become a Client using the Orange Key and open your first Account with a minimum deposit of $250.
CIBC ($50)
submitted by sailorm00nprinc3ss to Referral [link] [comments]

