2014.05.01 01:56 J0j2 Found Pieces of Paper

Photographs of found pieces of paper with writing on them, or discarded cutouts. Appreciate the forgotten artifacts of everyday life. Share any paper that you found (on the ground, stuck in some bushes, or between cans of soup at the store, for example) where you do not know who wrote it. Love letters, doodles, interesting to-do or grocery lists, notes from the past - share your discovery with us!
[link]


2008.06.11 11:41 kleinbl00 HomeOwners & Investors

real estate investing landlords landlord borrowing lending mortgages foreclosure loan houses house apartment financing loans buying a house foreclosures foreclosure forbearance home buying homebuying first time homebuyer
[link]


2014.09.08 02:01 bobby_jo Exchange-Traded Funds (ETFs)

The Exchange-Traded Funds Community and Forum
[link]


2024.04.28 22:45 throwing_hayy "Reasonable entry" for house being listed this week? 3 children including 6mth old!

RTB guidelines are vague in terms of what is considered "unreasonable entry". I understand 24hr notice etc and have read the BC housing page. We live in the house and work from home, and need to be able to have our 6mth old baby go down for naps 2-3 times/day and bedtime for the children around 6pm.
What is realistic and "reasonable entry" to discuss with the landlord/realtor that won't contravene the guidelines? 2 hour maximum, 3 days per week, not after 5pm? It's very challenging with children's routines to accommodate showings. Thanks in advance!!
submitted by throwing_hayy to vancouverhousing [link] [comments]


2024.04.28 22:40 Relative-Tone-4429 My autistic partner's communication habits are bothering me.

I have a sort of issue with the way my partner talks at me. I'm not entirely sure where to pinpoint the issue, but something's bothering me and I'm not sure what.
He talks whilst TV or films are on. That's not on its own a bad thing. The odd comment is fine in my opinion. Sharing appreciation of a moment, joke, or scenery is fine.
But he does it a lot. He thinks of things sometimes related to the film, sometimes not. In the middle of a film or show, he will comment on something that's happened that's perhaps similar, or reminds him of something, and he can't keep it in; he has to say it. When I click pause, it's like he doesn't even notice, he just carries on talking, going from one topic to another.
He takes a while to voice his thoughts and usually likes to fully explain himself. So he starts talking at a quiet moment in said film or show, when a comment might be acceptable, and then continues to talk for quite a while through dialogue or important scenes.
He says it's because he's seen it before or isn't interested but when he watches something he hasn't seen before and has chosen himself (so could just turn it off if he didn't enjoy it) and he does the same.
Annoyingly, he often chooses something he's seen before when we watch a film together, but doesn't tell me. We'll be watching and he will be talking about something, and the way he says it suggests he knows what's about to happen, and when I ask him if he's seen it before he says yes and then tries to convince me that he already told me that.
I don't really know why he does it. He says he sees watching things as a social activity, but he doesn't watch films or TV with friends, he goes out to socialise. We go out to socialise too. Mostly we watch things during the week in the evenings. I work as a teacher and I have to talk a lot, and I've explained to him that I wind down in the evenings and need peace, but he doesn't really stop talking at me.
When I'm working from home, he talks at me and then gets annoyed at me when I don't respond or tell him I'm busy. I don't think I'm busy a lot but I do work full time. He talks all the time when he's working from home, but also complains he has to work late but he doesn't appear to concentrate on it for very long, he plays games in between, scrolls websites and searches for music and talks on forums. I appreciate different people work differently but when I work, I concentrate hard for a few hours and then I 'empty my head' and relax when I'm done and he doesn't like that very much.
He tries to make out he's done things when he hasn't. He'll ask me what my plans are when I get in for example and if I talk about needing to do something, he will insist he's already done it even when I can see he hasn't. For example he asked me what I was doing one evening and I said I needed to put the washing on, it was in the machine, I just needed to put powder in and turn it on. He insisted he'd already done it and that it was clean it just needed to go in the dryer and he whipped it away and into the dryer before I'd even gotten to it. When I took it out the dryer it was off smelling and clearly not washed, just warm and dirty. We don't normally have those sort of passive aggressive discussions where one says something needs doing because they want the other person to do it, but that's my perspective, I don't know if he takes what I'm saying as that. When I bring it up he says in a jokey voice that hes good, he does domestic things. I'm not sure what to make of it.
He often talks about things he plans to do like he's thinking out loud. I started to tune this chat out a while back because I thought he was just thinking out loud. We live in a small space. But then he said I was ignoring him. But it's not like it's a conversation. He just talks at me saying things like 'I'm going to go to the gym at half 2' before explaining his thoughts about the gym and what the traffic will be like etc. He says I'm weird for not retaining any of the information he gives me, but I just wouldn't remember that he wants to go to the gym at half 2; he says he would remember if I said it.
He talks over me when I'm talking about something that requires more than a sentence or two. He says his brain works quickly and he's already thought of lots to say but often I haven't even finished my point, and the point he makes is tenuously linked to what I was saying and entirely related to what he was saying before I spoke, like I may as well have not spoken at all.
I have tried to explain that I feel frustrated that he isn't really 'listening' to me speak and he says he hates all those pointless phrases people use like 'oh yeah' or 'thats interesting'. He says he thinks it's far more complimentary to show someone you're listening by talking about something back to them that they made you think of.
Except he doesn't do that. He doesn't respond to what I've said at all. When I've pointed it out on occasion more recently in the moment when it's happened, he signposts me to a word or phrase I said right at the beginning of talking, which bears little resemblance to the point. For example, when I'm telling him why I like a book I'm reading, I could have used a particular word at the start before explaining what I liked, and he will cut me off 15 seconds in with how he got annoyed at his boss yesterday, and when I ask him what that has to do with my book he relays an anecdote about him and his boss some time in the past where the word was mentioned and it made him think of the fact he was annoyed at his boss. His point will take several minutes to deliver, and if I try to interject he lets me speak briefly but then just continues.
I don't talk too much myself, I don't think. I like to explain myself if there's something worth explaining, but I don't 'chatter'. I find verbal communication much harder than written. Our initial relationship was largely email focused as we were long distance for over a year and we would have long emails where many points went back and forth at the same time. He knows I'm autistic and appears to be quite aware of autism and traits etc.
Sometimes I wonder if I'm expecting too much from verbal conversation, or if I'm just not very good at it myself. I don't know if it's even a problem, or I'm just dealing with someone who communicates differently to myself. But this part of me feels something's distinctly 'off' and these are the points I keep coming back to.
Looking for advice/support/reassurance.
submitted by Relative-Tone-4429 to autism [link] [comments]


2024.04.28 22:38 Yogurtpops594 Why am I not angry enough?

I went through a pretty rough break up a little less than a year ago and I still find myself reminiscing. My friends say I have my rose colored glasses on which is definitely true but I’m just not sure how to fully process the things that happened to me. I (23f) and my ex boyfriend (23m) had a pretty toxic relationship that I am embarrassed to admit lasted longer than it should have. He has said some pretty heinous things to me and it’s like one part of my brain knows how f***** up it is but the other makes excuses still…even a year later! To break it down, we started talking, then there are some rumors he said some shady stuff to this girl at the bar. We start dating and he goes on vacation, I catch him logging onto tinder while he’s away. We continue to date, I went through his phone (impulsive and wrong choice on my part) and I catch him texting his ex girlfriend…about hanging out/laughing at me practically. We break up and I don’t know mentally what was going on in my brain because we ended up back together.
Mind you we fight all the time because anything I do is mean or disrespectful or insecure or stupid when I bring up how I feel. We fought because I was talking to my guy friend at the bar and he came later and saw me and told me to “scram”. Exact wording. We fought because I told him to stop making fat jokes about me and he lost it.
Then the real kicker was we were planning on going to NYC to meet his older sister. At first we were driving to the train station, he was driving a little reckless because I was running late (I had a bad cold). I told him to please slow down or calm down or whatever and he snaps on me that I can never be happy with anything. I of course apologized while we got on the train. I didn’t want to ruin the day and I didn’t mean to be a stick in the mud. But he didn’t think it was a sincere enough apology and once we got to NYC, he broke up with me and sent me back home.
I wish I could say that’s where the relationship ended but a part of me still liked him. We rekindled things about a month later because I somehow felt like I deserved that treatment. Things went smoothly, small disagreements but I made a promise to myself to be better (Actual conscious thought)(Sad I know).
Then he started getting mad at me for not being happy enough at dinner, not complimenting him enough (Mind you he barely compliments me without including himself), being drunk at the bar and being “disgusting”. Telling me I would be bored if I dated someone like myself.
He was a bully and mean but he wasn’t that way to anyone else. He ended up breaking up with me on the phone and telling me we were never even together because I was doing an important assignment and didn’t give him enough notice that I was gonna be late to us hanging out.
He quickly tried to get me back by saying if I’m nicer to him or whatever then we’ll be good. I said fuck that and went on a little bender. A couple weeks later, I ended up hooking up with this guy which was a mistake. Then another couple weeks later, I rekindled with my ex again. I ended up lying and didn’t tell him the full extent of what happened between me and the guy. I honestly didn’t feel like he deserved to know. My mentality at the time was you can break up with me, cheat on me, insult me, lie to me and I owe YOU the truth? I don’t even know why I started talking to him again…I think I was lonely and was seriously attached.
It did bite me in the butt because he did some digging and found out. YIKES. So he called me every name in the book, told me he’s going to ruin my life, f***all my friends and send me videos of it then blocked me.
I struggled with the guilt of lying for a good amount but I also felt like a hollow shell of a person. I didn’t know who I was and I came face to face with how low my self esteem actually was. I know I was seriously manipulated and abused.
Yet I don’t hate him even though I feel like I should or have the right to. I don’t know if my brain is possibly damaged from that relationship but I still catch myself missing him. I don’t have anger when I think of these moments. If anything I’m more sad I didn’t stick up for myself more. It’s just hard for me to be mad at him when I know I was wrong for lying. It’s like my brain is delusional or I forget how absolutely rancid he was. Does anyone know how to deal with this or what I should be doing?
Sorry for the long post. TY
submitted by Yogurtpops594 to emotionalabuse [link] [comments]


2024.04.28 22:37 Titty_Slicer_5000 Tensorflow Strided Slice Error. Need help.

TLDR at the bottom
My Full Tensorflow Code: Link. Please excuse all the different commented-out parts of the code; I've had a long road of troubleshooting this code.
Hardware and Software Setup
-Virtual Machine on Runpod
-NVIDIA A100 GPU
-Tensorflow 2.15
-CUDA 12.2
-cuDNN 8.9
What I'm doing and the issue I'm facing
I am trying to create a visual generator AI, and to that end I am implementing the TGANv2 architecture in TensorFlow. The TGANv2 model I am following was originally written in Chainer by some researchers. I also implemented it in PyTorch (here is my PyTorch code if you are interested) and also ran it in Chainer. It works fine in both. But when I try to implement it in TensorFlow I start running into this error:
 Traceback (most recent call last):
   File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/ops/script_ops.py", line 270, in __call__
     ret = func(*args)
   File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/autograph/impl/api.py", line 643, in wrapper
     return func(*args, **kwargs)
   File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/data/ops/from_generator_op.py", line 198, in generator_py_func
     values = next(generator_state.get_iterator(iterator_id))
   File "/workspace/3TF-TGANv2.py", line 140, in __iter__
     yield self[idx]
   File "/workspace/3TF-TGANv2.py", line 126, in __getitem__
     x2 = self.sub_sample(x1)
   File "/workspace/3TF-TGANv2.py", line 99, in sub_sample
     x = tf.strided_slice(x, begin, end, strides)
   File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/util/traceback_utils.py", line 153, in error_handler
     raise e.with_traceback(filtered_tb) from None
   File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/eager/execute.py", line 59, in quick_execute
     except TypeError as e:
 tensorflow.python.framework.errors_impl.InvalidArgumentError: {{function_node __wrapped__StridedSlice_device_/job:localhost/replica:0/task:0/device:GPU:0}} Expected begin and size arguments to be 1-D tensors of size 2, but got shapes [4] and [2] instead. [Op:StridedSlice]
What's important to note about this issue is that it does not come up right away. It can go through dozens of batches before this issue pops up. This error was generated with a batch size of 16, but if I lower my batch size to 8 I can even get it to run for 5 epochs (longest I've tried). The outputs of the Generator are not what I saw with Chainer or Pytorch after 5 epochs (it's mostly just videos of a giant black blob), though I am unsure if this is related to the issue. So with a batch size of 8 sometimes the issue comes up and sometimes it doesn't. If I lower the batch size to 4, the issue almost never comes up. The fact that this is batch size driven really perplexes me. I've tried it with multiple different GPUs.
Description of relevant parts of model and code
The way the Generator works is as follows. There is a CLSTM layer that generates 16 feature maps that have a 4x4 resolution and 1024 channels each. Each feature map corresponds to a frame of the output video (the output video has 16 frames and runs at 8fps, so it's a 2-second-long gif).
During inference each feature map passes through 6 upsampling blocks, with each upsampling block doubling the resolution and halving the channels. So after 6 blocks the shape of each frame is (256, 256, 16): it has a 256p resolution and 16 channels. Each frame then gets rendered by a rendering block into a 3-channel image of shape (256, 256, 3). So the final shape of the output video is (16, 256, 256, 3) = (T, H, W, C), where T is the number of frames, H the height, W the width, and C the number of channels. This output is a single tensor.
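As a sanity check, the shape progression through the six upsampling blocks can be computed directly (a minimal sketch based on the description above, not code from the actual model):

```python
# Each upsampling block doubles the spatial resolution and halves the channels.
h = w = 4           # starting resolution of each CLSTM feature map
c = 1024            # starting channel count
for _ in range(6):  # six upsampling blocks
    h, w, c = h * 2, w * 2, c // 2
print(h, w, c)  # 256 256 16 -> rendered down to (256, 256, 3) per frame
```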
During training the setup is a bit different. The generated output video is split up into 4 "sub-videos", each of varying resolution and frame count. This outputs a tuple of tensors: (tensor1, tensor2, tensor3, tensor4). The shapes of each tensor (after going through a rendering block to reduce the channel length to 3) are tensor1=(16, 32, 32, 3), tensor2=(8, 64, 64, 3), tensor3=(4, 128, 128, 3), tensor4=(2, 256, 256, 3). As you can see, going from tensor1 to tensor4 the frame number gets halved each time while the resolution doubles. The real video examples also get split up into 4 sub-video tensors of the same shapes. These sub-videos are what are fed into the discriminator. The functionality that halves the frame length is called sub-sampling. The function starts at either the first or second frame (this is supposed to be random) and then selects every other frame. There is a sub-sample function in both the VideoDataset class (which takes the real videos and generates 4 sub-video tensors) and in the Generator class. The VideoDataset class outputs 4-D tensors (T, H, W, C), while the Generator class outputs 5-D tensors because it also has a batch dimension N.
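The frame-halving behaviour described above can be illustrated with plain NumPy slicing (a standalone sketch with a made-up helper name, `sub_sample_np`; in the real model `offset` would be drawn randomly as 0 or 1):

```python
import numpy as np

video = np.zeros((16, 256, 256, 3))  # (T, H, W, C), like a VideoDataset sample

def sub_sample_np(x, frame=2, offset=0):
    # Keep every `frame`-th frame starting at `offset`, halving the frame count.
    return x[offset::frame]

x2 = sub_sample_np(video)  # (8, 256, 256, 3)
x3 = sub_sample_np(x2)     # (4, 256, 256, 3)
x4 = sub_sample_np(x3)     # (2, 256, 256, 3)
print(x2.shape[0], x3.shape[0], x4.shape[0])  # 8 4 2
```

This matches the frame counts of the four sub-video tensors listed above (16, 8, 4, 2).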
This is the sub-sample function in the VideoDataset class:
    def sub_sample(self, x, frame=2):
        original_shape = x.shape  # Logging original shape
        offset = 0
        begin = [offset, 0, 0, 0]  # start from index 'offset' in the frame dimension
        end = [original_shape[0], original_shape[1], original_shape[2], original_shape[3]]
        strides = [frame, 1, 1, 1]  # step 'frame' in the frame dimension
        x = tf.strided_slice(x, begin, end, strides)
        expected_frames = original_shape[0] // frame
        #print(f"VD Expected frames after sub-sampling: {expected_frames}, Actual frames: {x.shape[0]}")
        if x.shape[0] != expected_frames:
            raise ValueError(f"Expected frames: {expected_frames}, but got {x.shape[0]}")
        return x
This is the sub-sample function in the Generator class:
    def sub_sample(self, x, frame=2):
        original_shape = x.shape  # Logging original shape
        offset = 0
        begin = [0, offset, 0, 0, 0]  # start from index 'offset' in the second dimension
        end = [original_shape[0], original_shape[1], original_shape[2], original_shape[3], original_shape[4]]
        strides = [1, frame, 1, 1, 1]  # step 'frame' in the second dimension
        x = tf.strided_slice(x, begin, end, strides)
        expected_frames = original_shape[1] // frame
        #print(f"Gen Expected frames after sub-sampling: {expected_frames}, Actual frames: {x.shape[1]}")
        if x.shape[1] != expected_frames:
            raise ValueError(f"Expected frames: {expected_frames}, but got {x.shape[1]}")
        return x
You'll notice I am using tf.strided_slice(). I originally tried slicing/sub-sampling using the same notation you would do for slicing a numpy array: x = x[:,offset::frame,:,:,:]. I changed it because I thought maybe that was causing some sort of issue.
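For what it's worth, the two notations select exactly the same elements; this can be checked on a toy NumPy array (an illustrative sketch, not the TF code, with the explicit begin/end/strides lists mirroring the tf.strided_slice call above):

```python
import numpy as np

x = np.arange(2 * 16 * 4 * 4 * 3).reshape(2, 16, 4, 4, 3)  # toy (N, T, H, W, C) tensor
offset, frame = 0, 2

a = x[:, offset::frame, :, :, :]  # NumPy-style slicing notation
# Explicit begin/end/strides per dimension, as passed to tf.strided_slice:
b = x[tuple(slice(bg, en, st) for bg, en, st in
            zip([0, offset, 0, 0, 0], x.shape, [1, frame, 1, 1, 1]))]

assert (a == b).all()
print(a.shape)  # (2, 8, 4, 4, 3)
```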
Below is a block diagram of the Generator and VideoDataset (labeled "Dataset" in the block diagram) functionalities.
https://preview.redd.it/2vh7yx2g09xc1.png?width=1862&format=png&auto=webp&s=143d5c4c8df91fc71b9da1d3858feaae28c4605a
A point of note about the block diagram, the outputs of Dataset are NOT combined with the outputs of the Generator, as might be mistakenly deduced based on the drawing. The discriminator outputs predictions on the Generator outputs and the Dataset outputs separately.
I don't think this issue is happening in the backward pass because I put in a bunch of print statements and based on those print statements the error does not occur in the middle of a gradient calculation or backward pass.
My Dataloader and VideoDataset class
Below is how I am actually fetching data from my VideoDataset class:
    # Create dataloader
    dataset = VideoDataset(directory)
    dataloader = tf.data.Dataset.from_generator(
        lambda: iter(dataset),  # use iter() to clearly return an iterator from the dataset
        output_signature=(
            tf.TensorSpec(shape=(16, 32, 32, 3), dtype=tf.float32),
            tf.TensorSpec(shape=(8, 64, 64, 3), dtype=tf.float32),
            tf.TensorSpec(shape=(4, 128, 128, 3), dtype=tf.float32),
            tf.TensorSpec(shape=(2, 256, 256, 3), dtype=tf.float32)
        )
    ).batch(batch_size)
and here is my VideoDataset class:
    class VideoDataset():
        def __init__(self, directory, fraction=0.2, sub_sample_rate=2):
            print("Initializing VD")
            self.directory = directory
            self.fraction = fraction
            self.sub_sample_rate = sub_sample_rate
            all_files = [os.path.join(self.directory, file) for file in os.listdir(self.directory)]
            valid_files = []
            for file in all_files:
                try:
                    # Read the serialized tensor from file
                    serialized_tensor = tf.io.read_file(file)
                    # Deserialize the tensor
                    tensor = tf.io.parse_tensor(serialized_tensor, out_type=tf.float32)  # Adjust dtype if necessary
                    # Validate the shape of the tensor
                    if tensor.shape == (16, 256, 256, 3):
                        valid_files.append(file)
                except Exception as e:
                    print(f"Error loading file {file}: {e}")
            # Randomly select a fraction of the valid files
            selected_file_count = int(len(valid_files) * fraction)
            print(f"Selected {selected_file_count} files")
            self.files = random.sample(valid_files, selected_file_count)

        def sub_sample(self, x, frame=2):
            original_shape = x.shape  # Logging original shape
            offset = 0
            begin = [offset, 0, 0, 0]  # start from index 'offset' in the frame dimension
            end = [original_shape[0], original_shape[1], original_shape[2], original_shape[3]]
            strides = [frame, 1, 1, 1]  # step 'frame' in the frame dimension
            x = tf.strided_slice(x, begin, end, strides)
            expected_frames = original_shape[0] // frame
            #print(f"VD Expected frames after sub-sampling: {expected_frames}, Actual frames: {x.shape[0]}")
            if x.shape[0] != expected_frames:
                raise ValueError(f"Expected frames: {expected_frames}, but got {x.shape[0]}")
            return x

        def pooling(self, x, ksize):
            if ksize == 1:
                return x
            T, H, W, C = x.shape
            Hd = H // ksize
            Wd = W // ksize
            # Reshape the tensor to merge the spatial dimensions into the pooling blocks
            x_reshaped = tf.reshape(x, (T, Hd, ksize, Wd, ksize, C))
            # Take the mean across axes 2 and 4, the spatial dimensions within each block
            pooled_x = tf.reduce_mean(x_reshaped, axis=[2, 4])
            return pooled_x

        def __len__(self):
            return len(self.files)

        def __getitem__(self, idx):
            #print("Calling VD getitem method")
            serialized_tensor = tf.io.read_file(self.files[idx])
            video_tensor = tf.io.parse_tensor(serialized_tensor, out_type=tf.float32)
            x1 = video_tensor
            x2 = self.sub_sample(x1)
            x3 = self.sub_sample(x2)
            x4 = self.sub_sample(x3)
            x1 = self.pooling(x1, 8)
            x2 = self.pooling(x2, 4)
            x3 = self.pooling(x3, 2)
            #print(f"Shapes of VD output = {x1.shape}, {x2.shape}, {x3.shape}, {x4.shape}")
            return (x1, x2, x3, x4)

        def __iter__(self):
            print(f"Calling VD iter method, len self = {len(self)}")
            # Make the dataset iterable, so it can be used directly with tf.data.Dataset.from_generator.
            for idx in range(len(self)):
                yield self[idx]
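The reshape-and-mean trick used by the pooling method can be checked in isolation with NumPy (a sketch with a made-up name, `pooling_np`, mirroring the logic of `VideoDataset.pooling` above):

```python
import numpy as np

def pooling_np(x, ksize):
    # Average-pool each (ksize x ksize) spatial block of a (T, H, W, C) video.
    if ksize == 1:
        return x
    T, H, W, C = x.shape
    # Split H and W into (blocks, ksize) pairs, then average within each block.
    x = x.reshape(T, H // ksize, ksize, W // ksize, ksize, C)
    return x.mean(axis=(2, 4))

x = np.ones((16, 256, 256, 3))
print(pooling_np(x, 8).shape)  # (16, 32, 32, 3)
```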
In my opinion the issue is happening at some point while the dataloader is fetching examples from VideoDataset; I just can't figure out what is causing it.
TLDR
I am using a Runpod VM with an NVIDIA A100 GPU. I am trying to train a GAN that outputs 2-second-long gifs made up of 16 frames. One of the training steps involves splitting the output video (either real or fake) into 4 sub-videos of different frame length and resolution. The reduction of frames is achieved by a sub-sample function (which you can find earlier in my post) that starts at the first or second frame of the video (randomly) and then selects every other frame, so it halves the frames. So I am essentially doing a strided slice on a tensor, and I am using tf.strided_slice(). I tried using regular slicing notation (like you would use in NumPy), and I get the same error. The weird thing is that the issue does NOT come up immediately in training and is dependent on batch size. The training goes through several batch iterations just fine (and sometimes some epochs) with a batch size of 16. If I lower the batch size to 8 it's able to go through even more iterations, even up to 5 epochs (I didn't test it for longer), although the outputs are not what I would expect after some epochs (I expect a specific type of noisy image based on how this model ran in the PyTorch and Chainer frameworks, but instead I get a video that's mostly just a black blob, with just a bit of color on the edges). If I go down to a batch size of 4 the issue mostly goes away. See below for the error I am seeing:
Error:
Expected begin and size arguments to be 1-D tensors of size 2, but got shapes [4] and [2] instead. [Op:StridedSlice]
submitted by Titty_Slicer_5000 to MLQuestions [link] [comments]


#print("Calling VD getitem method") serialized_tensor = tf.io.read_file(self.files[idx]) video_tensor = tf.io.parse_tensor(serialized_tensor, out_type=tf.float32) x1 = video_tensor x2 = self.sub_sample(x1) x3 = self.sub_sample(x2) x4 = self.sub_sample(x3) #print("\n") x1 = self.pooling(x1, 8) x2 = self.pooling(x2, 4) x3 = self.pooling(x3, 2) #print(f"Shapes of VD output = {x1.shape}, {x2.shape}, {x3.shape}, {x4.shape}") return (x1, x2, x3, x4) def __iter__(self): print(f"Calling VD iter method, len self = {len(self)}") #Make the dataset iterable, allowing it to be used directly with tf.data.Dataset.from_generator. for idx in range(len(self)): yield self[idx]self.directory 
The issue is happening at one point when the dataloader is fetching examples from Videodataset in my opinion, I just can't figure out what is causing it.
TLDR
I am using a runpod VM with an NVIDIA A100 GPU. I am trying to train a GAN that outputs 2 second long gifs that are made up fo 16 frames. One of the training step involves splitting the output video (either real or fake) into 4 sub videos of different frame length and resolution. The reduction of frames is achieve by a sub-sample function (which you can find earlier in my post, it is bolded) that starts at the first or second frame of the video (random) and then selects every other frame, so it halves the frames. So I am essentially doing a strided slice on a tensor, and I am using tf.strided_slice(). I tried using regular slicing notation (like you would use in NumPy), and I get the same error. The weird thing about this is that the issue does NOT come up immediately in training and is dependent on batch size. The training goes through several batch iterations just fine (and sometimes some epochs) with a batch size of 16. If I lower the batch size to 8 it's absle to go thorugh even more iterations, even up to 5 epochs (I didn't test it for longer), although the outputs are not the outputs I would expect after some epochs (I expect a specific type of noisy image based on how this model ran in PyTorch and Chainer frameworks, but I instead get a video that's mostly just a black blob through most of the resolution, just a bit of color on the edges). If I go down to a batch size of 4 the issue goes away mostly. See below for the error I am seeing:
Error:
Expected begin and size arguments to be 1-D tensors of size 2, but got shapes [4] and [2] instead. [Op:StridedSlice]
submitted by Titty_Slicer_5000 to learnmachinelearning [link] [comments]


2024.04.28 22:37 Titty_Slicer_5000 [Project] Tensorflow Strided Slice Error. Need help.

[Project] Tensorflow Strided Slice Error. Need help.
TLDR at the bottom
My full TensorFlow code: Link. Please excuse all the different commented-out parts of the code; I've had a long road of troubleshooting this code.
Hardware and Software Setup
-Virtual Machine on Runpod
-NVIDIA A100 GPU
-Tensorflow 2.15
-CUDA 12.2
-cuDNN 8.9
What I'm doing and the issue I'm facing
I am trying to create a visual generator AI, and to that end I am implementing the TGANv2 architecture in TensorFlow. The TGANv2 model I am following was originally written in Chainer by some researchers. I also implemented it in PyTorch (here is my PyTorch code if you are interested) and ran it in Chainer. It works fine in both. But when I try to implement it in TensorFlow I start running into this error:
    Traceback (most recent call last):
      File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/ops/script_ops.py", line 270, in __call__
        ret = func(*args)
      File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/autograph/impl/api.py", line 643, in wrapper
        return func(*args, **kwargs)
      File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/data/ops/from_generator_op.py", line 198, in generator_py_func
        values = next(generator_state.get_iterator(iterator_id))
      File "/workspace/3TF-TGANv2.py", line 140, in __iter__
        yield self[idx]
      File "/workspace/3TF-TGANv2.py", line 126, in __getitem__
        x2 = self.sub_sample(x1)
      File "/workspace/3TF-TGANv2.py", line 99, in sub_sample
        x = tf.strided_slice(x, begin, end, strides)
      File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/util/traceback_utils.py", line 153, in error_handler
        raise e.with_traceback(filtered_tb) from None
      File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/eager/execute.py", line 59, in quick_execute
        except TypeError as e:
    tensorflow.python.framework.errors_impl.InvalidArgumentError: {{function_node __wrapped__StridedSlice_device_/job:localhost/replica:0/task:0/device:GPU:0}} Expected begin and size arguments to be 1-D tensors of size 2, but got shapes [4] and [2] instead. [Op:StridedSlice]
What's important to note about this issue is that it does not come up right away. It can go through dozens of batches before this issue pops up. This error was generated with a batch size of 16, but if I lower my batch size to 8 I can even get it to run for 5 epochs (longest I've tried). The outputs of the Generator are not what I saw with Chainer or Pytorch after 5 epochs (it's mostly just videos of a giant black blob), though I am unsure if this is related to the issue. So with a batch size of 8 sometimes the issue comes up and sometimes it doesn't. If I lower the batch size to 4, the issue almost never comes up. The fact that this is batch size driven really perplexes me. I've tried it with multiple different GPUs.
Description of relevant parts of model and code
The way the Generator works is as follows. There is a CLSTM layer that generates 16 feature maps, each with a 4x4 resolution and 1024 channels. Each feature map corresponds to a frame of the output video (the output video has 16 frames and runs at 8 fps, so it's a 2-second-long gif).
During inference each feature map passes through 6 upsampling blocks, with each upsampling block doubling the resolution and halving the channels. So after 6 blocks the shape of each frame is (256, 256, 16): a 256p resolution with 16 channels. Each frame then passes through a rendering block that renders it into a 3-channel image of shape (256, 256, 3). So the final shape of the output video is (16, 256, 256, 3) = (T, H, W, C), where T is the number of frames, H the height, W the width, and C the number of channels. This output is a single tensor.
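As a sanity check on those numbers, the doubling/halving across the six upsampling blocks can be traced in a few lines (illustrative arithmetic only, not the model code):

```python
# Trace (resolution, channels) through the 6 upsampling blocks:
# each block doubles the resolution and halves the channels.
res, ch = 4, 1024  # CLSTM output: 4x4 feature maps, 1024 channels
for block in range(1, 7):
    res, ch = res * 2, ch // 2
    print(f"after block {block}: {res}x{res}, {ch} channels")
# ends at 256x256 with 16 channels, which the rendering block maps to (256, 256, 3)
```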
During training the setup is a bit different. The generated output video is split up into 4 "sub-videos", each of varying resolution and frame count. This yields a tuple of tensors: (tensor1, tensor2, tensor3, tensor4). The shapes of each tensor (after going through a rendering block to reduce the channel length to 3) are tensor1=(16, 32, 32, 3), tensor2=(8, 64, 64, 3), tensor3=(4, 128, 128, 3), tensor4=(2, 256, 256, 3). As you can see, going from tensor1 to tensor4 the frame count gets halved each time while the resolution doubles. The real video examples also get split up into 4 sub-video tensors of the same shapes. These sub-videos are what get fed into the discriminator. The functionality that halves the frame length is called sub-sampling: the function starts at either the first or second frame (this is supposed to be random) and then selects every other frame. There is a sub-sample function in both the VideoDataset class (which takes the real videos and generates 4 sub-video tensors) and in the Generator class. The VideoDataset class outputs 4-D tensors (T, H, W, C), while the Generator class outputs 5-D tensors because it has a batch dimension N.
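To make the sub-video shapes concrete, here is a small NumPy sketch of the frame-halving described above (the pooling factors 8/4/2/1 are implied by the target resolutions; this mirrors the description, not the exact TF code):

```python
import numpy as np

video = np.zeros((16, 256, 256, 3), dtype=np.float32)  # one (T, H, W, C) video

def halve_frames(x, offset=0):
    # Start at frame `offset` (0 or 1) and keep every other frame.
    return x[offset::2]

x1 = video                # 16 frames
x2 = halve_frames(x1)     # 8 frames
x3 = halve_frames(x2)     # 4 frames
x4 = halve_frames(x3)     # 2 frames

# Spatial pooling by factors 8/4/2/1 then yields the four discriminator inputs:
for x, k in zip((x1, x2, x3, x4), (8, 4, 2, 1)):
    print((x.shape[0], x.shape[1] // k, x.shape[2] // k, x.shape[3]))
# (16, 32, 32, 3), (8, 64, 64, 3), (4, 128, 128, 3), (2, 256, 256, 3)
```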
This is the sub-sample function in the VideoDataset class:
    def sub_sample(self, x, frame=2):
        original_shape = x.shape  # log the original shape
        offset = 0
        begin = [offset, 0, 0, 0]  # start from index 'offset' in the frame dimension
        end = [original_shape[0], original_shape[1], original_shape[2], original_shape[3]]
        strides = [frame, 1, 1, 1]  # step 'frame' in the frame dimension
        x = tf.strided_slice(x, begin, end, strides)
        expected_frames = original_shape[0] // frame
        #print(f"VD Expected frames after sub-sampling: {expected_frames}, Actual frames: {x.shape[0]}")
        if x.shape[0] != expected_frames:
            raise ValueError(f"Expected frames: {expected_frames}, but got {x.shape[0]}")
        return x
This is the sub-sample function in the Generator class:
    def sub_sample(self, x, frame=2):
        original_shape = x.shape  # log the original shape
        offset = 0
        begin = [0, offset, 0, 0, 0]  # start from index 'offset' in the second (frame) dimension
        end = [original_shape[0], original_shape[1], original_shape[2], original_shape[3], original_shape[4]]
        strides = [1, frame, 1, 1, 1]  # step 'frame' in the second dimension
        x = tf.strided_slice(x, begin, end, strides)
        expected_frames = original_shape[1] // frame
        #print(f"Gen Expected frames after sub-sampling: {expected_frames}, Actual frames: {x.shape[1]}")
        if x.shape[1] != expected_frames:
            raise ValueError(f"Expected frames: {expected_frames}, but got {x.shape[1]}")
        return x
You'll notice I am using tf.strided_slice(). I originally tried slicing/sub-sampling with the same notation you would use for a NumPy array: x = x[:, offset::frame, :, :, :]. I changed it because I thought that notation might be causing some sort of issue.
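For what it's worth, the two notations should be equivalent when the tensor rank matches, and the error message ("Expected begin and size arguments to be 1-D tensors of size 2, but got shapes [4]") suggests a rank-2 tensor reached the 4-D sub_sample. A NumPy sketch of both points (illustrative stand-in, not the TF code):

```python
import numpy as np

x = np.arange(16 * 4 * 4 * 3).reshape(16, 4, 4, 3)  # stand-in (T, H, W, C) video

# numpy-style slicing: every other frame from offset 0
a = x[0::2]

# explicit begin/end/strides, one slice per dimension -- what the
# tf.strided_slice call spells out with its length-4 lists
slices = tuple(slice(b, e, s) for b, e, s in
               zip([0, 0, 0, 0], x.shape, [2, 1, 1, 1]))
b = x[slices]
assert (a == b).all()  # identical results when the rank is as expected

# A rank-2 element hitting 4-D indexing fails, analogous to the
# StridedSlice error in the traceback:
bad = np.zeros((16, 3))
try:
    bad[0::2, :, :, :]
except IndexError as err:
    print("rank mismatch:", err)
```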
Below is a block diagram of the Generator and VideoDataset (labeled "Dataset" in the block diagram) functionalities.
https://preview.redd.it/2vh7yx2g09xc1.png?width=1862&format=png&auto=webp&s=143d5c4c8df91fc71b9da1d3858feaae28c4605a
A note about the block diagram: the outputs of Dataset are NOT combined with the outputs of the Generator, as might be mistakenly deduced from the drawing. The discriminator makes predictions on the Generator outputs and the Dataset outputs separately.
I don't think this issue is happening in the backward pass because I put in a bunch of print statements and based on those print statements the error does not occur in the middle of a gradient calculation or backward pass.
My Dataloader and VideoDataset class
Below is how I am actually fetching data from my VideoDataset class:
    # Create dataloader
    dataset = VideoDataset(directory)
    dataloader = tf.data.Dataset.from_generator(
        lambda: iter(dataset),  # use iter() to clearly return an iterator over the dataset
        output_signature=(
            tf.TensorSpec(shape=(16, 32, 32, 3), dtype=tf.float32),
            tf.TensorSpec(shape=(8, 64, 64, 3), dtype=tf.float32),
            tf.TensorSpec(shape=(4, 128, 128, 3), dtype=tf.float32),
            tf.TensorSpec(shape=(2, 256, 256, 3), dtype=tf.float32)
        )
    ).batch(batch_size)
and here is my VideoDataset class:
    class VideoDataset():
        def __init__(self, directory, fraction=0.2, sub_sample_rate=2):
            print("Initializing VD")
            self.directory = directory
            self.fraction = fraction
            self.sub_sample_rate = sub_sample_rate
            all_files = [os.path.join(self.directory, file) for file in os.listdir(self.directory)]
            valid_files = []
            for file in all_files:
                try:
                    # Read the serialized tensor from file
                    serialized_tensor = tf.io.read_file(file)
                    # Deserialize the tensor
                    tensor = tf.io.parse_tensor(serialized_tensor, out_type=tf.float32)  # Adjust dtype if necessary
                    # Validate the shape of the tensor
                    if tensor.shape == (16, 256, 256, 3):
                        valid_files.append(file)
                except Exception as e:
                    print(f"Error loading file {file}: {e}")
            # Randomly select a fraction of the valid files
            selected_file_count = int(len(valid_files) * fraction)
            print(f"Selected {selected_file_count} files")
            self.files = random.sample(valid_files, selected_file_count)

        def sub_sample(self, x, frame=2):
            original_shape = x.shape  # log the original shape
            offset = 0
            begin = [offset, 0, 0, 0]  # start from index 'offset' in the frame dimension
            end = [original_shape[0], original_shape[1], original_shape[2], original_shape[3]]
            strides = [frame, 1, 1, 1]  # step 'frame' in the frame dimension
            x = tf.strided_slice(x, begin, end, strides)
            expected_frames = original_shape[0] // frame
            #print(f"VD Expected frames after sub-sampling: {expected_frames}, Actual frames: {x.shape[0]}")
            if x.shape[0] != expected_frames:
                raise ValueError(f"Expected frames: {expected_frames}, but got {x.shape[0]}")
            return x

        def pooling(self, x, ksize):
            if ksize == 1:
                return x
            T, H, W, C = x.shape
            Hd = H // ksize
            Wd = W // ksize
            # Reshape the tensor to merge the spatial dimensions into the pooling blocks
            x_reshaped = tf.reshape(x, (T, Hd, ksize, Wd, ksize, C))
            # Take the mean across dimensions 2 and 4, the spatial dimensions within each block
            pooled_x = tf.reduce_mean(x_reshaped, axis=[2, 4])
            return pooled_x

        def __len__(self):
            return len(self.files)

        def __getitem__(self, idx):
            #print("Calling VD getitem method")
            serialized_tensor = tf.io.read_file(self.files[idx])
            video_tensor = tf.io.parse_tensor(serialized_tensor, out_type=tf.float32)
            x1 = video_tensor
            x2 = self.sub_sample(x1)
            x3 = self.sub_sample(x2)
            x4 = self.sub_sample(x3)
            x1 = self.pooling(x1, 8)
            x2 = self.pooling(x2, 4)
            x3 = self.pooling(x3, 2)
            #print(f"Shapes of VD output = {x1.shape}, {x2.shape}, {x3.shape}, {x4.shape}")
            return (x1, x2, x3, x4)

        def __iter__(self):
            print(f"Calling VD iter method, len self = {len(self)}")
            # Make the dataset iterable, so it can be used directly with tf.data.Dataset.from_generator.
            for idx in range(len(self)):
                yield self[idx]
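The pooling method in the class is plain average pooling done with a reshape and a mean; a NumPy mirror of it (a sketch, not the TF code) makes the shape bookkeeping easy to verify:

```python
import numpy as np

def avg_pool(x, ksize):
    # Mirror of VideoDataset.pooling: average each ksize x ksize spatial
    # block down to one pixel.
    if ksize == 1:
        return x
    T, H, W, C = x.shape
    x = x.reshape(T, H // ksize, ksize, W // ksize, ksize, C)
    return x.mean(axis=(2, 4))

video = np.random.rand(16, 256, 256, 3).astype(np.float32)
print(avg_pool(video, 8).shape)  # (16, 32, 32, 3)

# A constant input pools to the same constant:
assert np.allclose(avg_pool(np.ones((2, 8, 8, 3)), 4), 1.0)
```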
In my opinion, the issue is happening when the dataloader fetches examples from VideoDataset; I just can't figure out what is causing it.
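One way to narrow this down (a debugging sketch under my assumptions, not part of the original code): the "size 2" in the error implies the op saw a rank-2 tensor while `begin` had 4 entries, i.e. some element parsed at fetch time was not (16, 256, 256, 3) even though files were validated at init. A guard at the top of `__getitem__` would make such an element fail with a pointed message instead of deep inside tf.strided_slice; the guard itself is framework-agnostic:

```python
import numpy as np

def ensure_video_shape(x, expected=(16, 256, 256, 3)):
    # Fail fast if a parsed tensor is not the (T, H, W, C) video shape
    # sub_sample assumes. (Hypothetical helper -- call it on
    # video_tensor right after tf.io.parse_tensor in __getitem__.)
    shape = tuple(x.shape)
    if len(shape) != len(expected):
        raise ValueError(
            f"rank-{len(shape)} tensor {shape}, expected rank-{len(expected)} {expected}")
    if shape != expected:
        raise ValueError(f"shape {shape}, expected {expected}")
    return x

# A malformed rank-2 element, like the one the error message implies:
try:
    ensure_video_shape(np.zeros((16, 3)))
except ValueError as err:
    print(err)
```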
TLDR
I am using a runpod VM with an NVIDIA A100 GPU. I am trying to train a GAN that outputs 2-second-long gifs made up of 16 frames. One of the training steps involves splitting the output video (either real or fake) into 4 sub-videos of different frame length and resolution. The reduction of frames is achieved by a sub-sample function (which you can find earlier in my post, it is bolded) that starts at the first or second frame of the video (random) and then selects every other frame, so it halves the frames. So I am essentially doing a strided slice on a tensor, and I am using tf.strided_slice(). I tried using regular slicing notation (like you would use in NumPy), and I get the same error. The weird thing is that the issue does NOT come up immediately in training and is dependent on batch size. The training goes through several batch iterations just fine (and sometimes some epochs) with a batch size of 16. If I lower the batch size to 8 it's able to go through even more iterations, even up to 5 epochs (I didn't test it for longer), although the outputs are not what I would expect after some epochs (I expect a specific type of noisy image based on how this model ran in the PyTorch and Chainer frameworks, but I instead get a video that's mostly just a black blob, with a bit of color on the edges). If I go down to a batch size of 4 the issue mostly goes away. See below for the error I am seeing:
Error:
Expected begin and size arguments to be 1-D tensors of size 2, but got shapes [4] and [2] instead. [Op:StridedSlice]
submitted by Titty_Slicer_5000 to MachineLearning [link] [comments]


2024.04.28 22:29 Conscious_Catch_1951 The way everyone’s posts are captioned “stagecoach day x”, “getting ready for stagecoach”

Like….. bachelorette where?! Esp Hallie/that part of the clique. They’re there for stagecoach/brand vibes and not to celebrate Emma which is so sad. I wonder if she’s even noticed. The only person that has bachelorette actually written out in the caption might be Emma herself
submitted by Conscious_Catch_1951 to macdonaldsisters [link] [comments]


2024.04.28 22:15 AppropriateLeather63 A modern review of Fallout 76

I am one of those new players who watched the TV show and bought the game on sale for 8 bucks around when it released. I've been playing after work for a few weeks now. Here is my review.
I like Fallout, having beat Fallout 3 and Fallout New Vegas as a kid, and having played Fallout 4 but never beat it. I'm also a hardcore FF14 player, so I'm highly familiar with traditional MMOs. I had heard bad things about the game when it came out, but loved the TV show so much that I wanted to hop back into the universe, and this was the only Fallout game I'd never played.
What I was hoping for from the game was just to capture that Fallout feeling. To explore the world and the lore the TV show had reinspired a love for, maybe some nostalgia from the older games.
My first experience with the game was Vault 76. This was a big let down. You are the only human in Vault 76, player or NPC. The explanation is that you woke up from last night's party late. You are immediately given some supplies and rushed out of the Vault by Mr. Handy robots. There is no real story or lore to this vault, no sinister Vault-Tec experiment to uncover, and there is no way to return to the vault as far as I know.
The Vault intros to the story are some of the most nostalgic memories from the older Fallout games. Growing up in the Vault in Fallout 3 or being rushed into it in Fallout 4. Coming back to your home Vault and either saving or dooming it was also one of the better side quests in three.
I was disappointed in this introduction to the game, and almost turned it off. But I'm glad I didn't.
I was given the choice here between starting at level 20 or level 1. I chose to start at level 20, because the game recommended it and I wasn't sure if I liked the game, so I primarily wanted to rush to see the Atlantic City content that had just come out. I was given two guns, a bunch of good supplies, and some perks. I didn't feel overpowered, but I did feel like I could hold my own in a fight. It felt like the right choice at the time, but later on, once I found myself liking the game and realized that many other players had chosen to start at level 1, I kind of wish I had chosen level 1 too. It would have made the meat of the game last longer, and better captured that feeling of struggling to survive in the Wasteland once you first leave the vault.
Upon exiting the Vault, you find yourself in a thriving forest. Kind of weird. I was expecting to exit into a dying wasteland, terrified of the hostile nature of the outside world compared to the vault. Apparently this game takes place 25 years after the bombs fall, which doesn't make a lot of sense lore-wise. How is the starting area so alive and normal 25 years after the nuclear apocalypse? It also doesn't make a lot of sense because the game basically acts like it takes place 200 years after the bombs fall later on, with everyone acting as if this is the only world anyone has ever known. Besides the starting forest, the rest of the world is pretty sufficiently devastated.
The next thing I noticed was a big positive. The combat is very, very good and very, very faithful to the older Fallout games. This was marketed as an MMO, so I was expecting MMO combat, where I would be shepherded into a tank, healer, or DPS role and cycle through a rotation of actions or spells with a GCD while I dodged mechanics. Instead, I was met with a faithful recreation of good old fashioned First Person Shooter Fallout combat. There was a wide assortment of awesome ballistic, energy, melee, and throwable weapons to scavenge. Enemies sometimes turned into ash when I used a laser weapon, or into goo when I used a plasma weapon. There was even a somewhat faithful recreation of VATS, which I wouldn't have thought possible in an MMO. It only slows time a little bit, but it's otherwise pretty much exactly the same. You can even get the Mysterious Stranger perk, and he feels just as powerful as he did in the mainline games. I was fighting all of the classic enemies from the Fallout universe. I found the combat to be very nostalgic and very fun. This is what kept me from turning the game off.
I began to realize that this game is much more similar to "Borderlands" than it is to "World of Warcraft" or even "Elder Scrolls Online".
I decided to sprint straight to the Whitesprings so I could get to Atlantic City and experience the newest content. Upon getting there, I was met with a series of daily quests before I could do the expeditions. These were super lame, like cooking soup or collecting steel. Typical MMO filler fare. Very disappointing. This is the second time (but the last time) I almost turned off the game. It got better from here.
The expeditions were pretty fun. This is the closest thing to a traditional MMO instanced "dungeon" that Fallout 76 has. You'll fight hordes of enemies related to the new area you're exploring, complete objectives, and get handsomely rewarded with legendary equipment, power armor pieces, and currencies. It feels like a Fallout themed action movie, which is where Fallout 76 is at its best. There will be a boss at the end. It's extremely hard to find a group for these. There is no real matchmaking system. You have to create a party and hope someone on your server decides to join. This can sometimes take hours, but you do get to go do other stuff while you wait. I have never had more than 2 people join for one of these. When they do join, they're probably going to be hundreds of levels higher than you, instead of someone your level. I'm not sure these have as much replay value as they are supposed to. Overall though, my first experience with each of the 3 expeditions I've done so far was a good time and had good rewards.
Next I tried the Atlantic City main story line. My verdict is that it's about what you'd expect from a mainline Fallout DLC. I won't spoil it, but the characters and plot are well-written and interesting. It's pretty short. The new areas and enemies are fun, I especially liked fighting against and helping the mafia, like some sort of post-apocalyptic The Sopranos.
It was ridiculously buggy, even compared to the Appalachia Fallout 76 experience, which was already buggy enough. During the last quest, my ally bugged into the wall on the way to the boss and I had to restart the game multiple times to get him out. The boss, the Jersey Devil, was cool. He was hard at my low level. He killed me the first time I tried to fight him, and again when I came back with power armor. But when I wanted to come back to try a third time, this time with power armor and a legendary Gatling plasma, I found that the boss had simply already died to a bug. Anti-climactic. This game also lacks difficulty, which I'll get into more later, so it was really disappointing to see the one hard boss I've found die to a bug. The rewards for the questline were pretty meh, I got a bunch more caps than I got in normal quests, but no cool gear. But, I mean, it's Bethesda. Their games are always buggy. It was an experience that felt pretty typical of a mainline Fallout DLC. Well-written, short, buggy. Not as good as the main game, but fun.
Next I did the Wastelanders questline. This is the best Fallout 76 has to offer, at least out of what I've seen so far. It's also the closest thing Fallout 76 has to a "Main Story Questline". This is pretty much exactly what I wanted from this game. Apparently, the game lacked human NPCs before this was added, but there are tons of them in here. This is a Fallout Theme Park Action Movie. There are Vault-Tec secrets, communist bunkers, armies of evil corporate robots, cool Ghoul characters, tons of hidden lore to discover, you name it. There are moral choices to make, sides to take, stat checks during dialogue, nostalgia, epic action sequences, and handsome rewards. I went into this questline low on supplies and came out of it with epic legendary energy weapons. This questline exceeded all of my expectations, especially after the bad stuff I heard about this game. It's a ton of fun, and captures that mainline Fallout story experience I was hoping for. This alone was worth the 8 bucks I spent.
The game economy was weird. There is no global trade system or market, as is typical of MMOs. Instead, you basically have to explore the camps of other players to see what they are selling, and there are only 25 players max on a server at a time, and usually around 10. But ultimately, I think this decision kind of serves the game well. This is a post-apocalypse game, so a global economy would break the immersion. It also leads to some fun situations, where no one is selling the stuff you need so you have to pay triple the fair price or whatever. Like the time I had power armor but no fusion cores. I had spent hours trying to find them. I ran into a player on the road, asked to trade, and requested his fusion cores. He had 7, and I was hoping he'd sell me 1 for like 100 caps, but instead he sold me all 7 for 25 caps a piece. The next time I needed a fusion core, though, I couldn't find any available and ended up having to pay like 250 caps for one, which was still worth it to me because the power armor is so strong. Honestly, it captures the feel of a post-apocalyptic barter economy pretty well. The drawback is that it's hard to sell stuff if you aren't an established player. I have some rare serums for sale, but I don't have a cool CAMP or hundreds of legendary weapons for sale, so no one ever comes to my CAMP to buy them.
The CAMPS are better than I expected them to be. The player base builds some incredible stuff. I once found a CAMP with like a million water purifiers outside and a ridiculous amount of items for sale at cheap prices. This guy had a literal line forming outside his CAMP for the vendor, with a bunch of other players there for the water. Over half the server was probably there. I ended up placing my CAMP on the cliffs on the outside of The Rose Room, a defensible location with a useful location nearby. At first I didn't think I'd be into building my CAMP at all, and just planned to put down my vending machine, my stash box, and my ally. But then my ally got attacked by a group of raiders with laser weapons by surprise attacking my cliff from below. I'm unsure if this was a random event, or if I just happened to put my CAMP close to where they spawn. Regardless, I spent 20 minutes fighting them off with my ally. Feeling bad for the poor chap, who had agreed to come live with me and then found nothing but the aforementioned vendor and stash box, I decided to build some defenses and some beds. I built four turrets on the cliffsides, some fences, and some beds. It was fun. I'm not sure I'll ever be really into building the coolest CAMP, but the game got me to spend more time on it than I thought I would.
You'll pretty much never encounter other players, unless you seek them out by doing events and expeditions or go to their CAMP or join their party. Finding a player in the wastes feels like a random encounter. For the most part, this game plays like a single player fallout game, with optional multiplayer content included. I even played with the Lone Wanderer perk, which makes me stronger for playing alone. Again, more Borderlands than World of Warcraft. Heck, even Borderlands was more multiplayer than this.
There is virtually no end-game, as far as I can tell. A game like WOW or FF14 has pretty much endless stuff to do in the end game, as well as epic raid bosses that only .001% of the playerbase will ever beat. This game has none of that. It has dailies, events, expeditions, building your CAMP, and collecting even better armor and weapons. That's pretty much it. Maybe you might get into PVP, but it's a sideshow, the game isn't designed for it. You don't really need better weapons or armor, because the game is easy. Dailies, events, and expeditions are the most boring content that traditional MMOs have to offer. Once you've finished the meat of the game, you'll probably turn it off for good. It will probably be a while before this happens for me, though, because there are quite a lot of story quests, locations to explore, and lore to discover. But once you're out of stuff to do, it looks like you're pretty much out of stuff to do. Even the original Borderlands had a raid boss: Crawmerax the Invincible. This game could have benefited from stuff like that. Imagine a Super Mutant Behemoth as a raid boss. Missed opportunity.
The game is ridiculously easy, one of the easiest I've ever played. For context, I've died a grand total of 5 times at level 46. I died once sprinting to the Whitesprings, to the horde of enemies following me that killed me right at the door, which was cool. I died once by getting jumped by a group of Super Mutants on the road to my quest, also cool. I died once to my first Assaultron because I was low on supplies. And I died twice to the Jersey Devil. If I had been careful in the early game and prepared for the Jersey Devil fight, I easily could have died zero times by now. Once you get power armor, energy weapons, a stack of Stimpaks and some chems, you're pretty much unstoppable. Nothing the game has to offer is going to touch you. Aside from the power armor, none of this stuff is that hard to get. By level 35, I felt like a god. This has its perks and drawbacks. It can be very relaxing to mow down hordes of enemies while you play the classical music radio station. It's fun to feel like a post apocalyptic god of war. It makes exploring the story and the lore easier, which is what this game is clearly meant for. But if you're looking for a challenge, there is none to be had here. This is like playing Fallout 3 on Very Easy mode.
Overall: 8/10.
This is a Fallout theme-park action movie. If you're looking to scratch that Fallout itch after the TV show, this is the game for you. It's a lot better than it was when it was released. It has some flaws, but if you want to play Fallout, this will do the trick, especially at the price point. It's not much of an MMO; think Borderlands. And it's not very difficult. If you're looking for an MMO or a competitive scene or something, this is not the game for you.
submitted by AppropriateLeather63 to Fallout [link] [comments]


2024.04.28 22:09 ftb-house Avoiding rent and mortgage overlap on first time purchase

We're first time buyers (England) with an offer accepted, the seller has their future place already lined up so wants the sale to move as quickly as possible.
We're currently renting with a rolling contract where we need to give two months notice to move out (we pay on the 1st of each month). We'd be happy having one month of overlap with the mortgage so we can paint the house and gradually move, but we really want to avoid paying a second month of overlapping rent.
We're not 100% sure what to expect / how to avoid too much overlap:
  1. I assume that we would give notice to our landlord as soon as we have exchanged contracts?
  2. How is the completion date usually decided? (Do both parties get to negotiate a completion date, or does the seller usually have more of a say?)
  3. If we do get a say, would we just ask that completion takes place one month after exchange, giving us a one month overlap?
Thanks!
submitted by ftb-house to HousingUK [link] [comments]


2024.04.28 21:56 lisrh want to break verbal lease after 3 months - landlord is police officer

My roomie and I are going through hell with our current landlord, who happens to be a police officer. We asked for a 1-year lease contract when we moved in but he never gave it to us; he just said "Ok" to the 1-year stay. His Facebook listing said minimum 3 months. It's been 3 months now and this is our worst LL ever. He lives with his family in the upstairs portion of the house.
We want to move out. We even thought of being respectful and giving a month's notice bc that's how we are. How do we go about politely asking him to give us our deposit back and telling him we want to leave? Court is out of the question. He's a police officer. There is no physical lease. Everything was agreed through texts.
Let me tell you: I have been renting for 5 years and have amazing landlord-tenant relationships. One of my landlords even became a close friend, invited me on family trips abroad etc. so this is not a me problem and I know this for sure. We have been extremely kind to them and followed every rule. We even paid electricity for the month we didn’t stay here to avoid a fight.
submitted by lisrh to Tenant [link] [comments]


2024.04.28 21:52 hollapainyo Landlord not responding to messages saying I will be vacating the apt in July - how to handle?

Hi! I've been in my studio apartment in Ditmas Park for 5+ years now. Landlord has been relatively absentee but not horrible re: major issues.
Here's the situation - I signed a one year lease in Feb. 2019, and despite requesting another one when it expired, I never heard anything back and never got another lease, so I've been just sending a check every month. Since I have no lease I imagine I'm month to month.
I'm moving in with my girlfriend in July and have been trying to tell the management company that I'll be vacating the apartment at that point since early April. I've sent two emails and left two messages with the management company and have heard absolutely nothing back.
How should I handle this? I'm trying to give as much notice as possible but the lack of response is getting a bit worrying. I'm thinking about writing to say I will use my security deposit as the last month's rent barring any acknowledgement of my move/return of my deposit from the management company. Does that sound reasonable?
submitted by hollapainyo to NYCapartments [link] [comments]


2024.04.28 21:49 Ashlxy First time reading the four gospels, my personal experience.

Hello, first of all I want to say I don't post on reddit, so if the format is off that is why.
I'm quite interested in history, so I decided to start reading the Bible around 1 year ago, since I knew it described real places and events. After reading some of the Old Testament (KJV) I stopped during Leviticus; I don't remember my exact reason for stopping, but I do remember having a lot of tough questions. Recently YouTube recommended me a guy by the name of Cliffe Knechtle and I thought to myself "ah, just another Christian preacher", but for some reason I clicked on it and was speechless. He was tackling all the tough questions I had about Christianity and all of his answers made sense to me; genuinely, for the first time in my life, my core atheist positions were shaken. I had not known the gospels were eyewitness accounts, and I had also never really learned about Jesus Christ; everything my parents had ever told me about Jesus was false, and this is something I felt in my heart. It wasn't just the eyewitness accounts of Jesus that shook me but the fact that they were willing to die for Jesus Christ after they had initially disowned him before the resurrection. As Cliffe Knechtle said, "people don't die for something they know to be a lie". I am interested in human psychology and have studied a little psychology at university, and in my human experience I trust this to be true. It's not just one individual but multiple who died for their eyewitness testimonies of Jesus, and this really changed my perspective. After reading how Jesus spoke and treated other people, it's something I look up to and want to learn from. As the title states, I have only read the 4 gospels (NIV), that is Matthew, Mark, Luke and John, but I will definitely continue reading further.
My entire life (26 years) I've been told Christianity is a ridiculous story or that Christians take advantage of the desperate. I've been raised in an atheist/spiritual household; my father is a firm atheist and my mother is a spiritualist who believes she is talking to dead spirits and rejects Jesus. Everybody I know laughs at Christianity and rejects it, and so did I for most of my life. I've struggled with depression and anxiety for most of my life because I didn't see a meaning to life or any point in it. My biological father hasn't been around my entire life and my mother had me at 17, so I thought I was a mistake, but now I'm genuinely starting to think otherwise.
Since reading the 4 gospels my mental health has definitely gotten better, and other people have noticed as well (although I keep any Christian beliefs to myself). I started praying sometimes (although I don't know how to officially), but I feel like a fake because I was against Jesus for my entire life; my prayers are usually asking for forgiveness or praying for my family. I can't explain why, but one verse made me cry, and I very rarely cry: Matthew chapter 11 verse 28 (NIV). I still remember the feeling clearly; it felt like a weight had been taken off my shoulders, similar to what relief feels like.
There are quite a few chapters and verses that really touched my heart, but a few are: John chapter 15 verses 1-17, Matthew chapters 10-11, John chapter 8.
Thank you to whoever has read through this post; I apologise if it was messy, as I don't write much or post on this website.
submitted by Ashlxy to Christianity [link] [comments]


2024.04.28 21:43 RyanBThiesant DWP Universal Credit overpayment should not be 100% of your monthly payment.

The Universal Credit DWP PAYE earnings-detection system is broken. Because of this, a lot of UC overpayments happen. The DWP looks for cases where there might be an overpayment and then quickly, and unlawfully, takes the money back.
(1) They should give you a notice with appeal rights.
(2) They should not take back 100% of your UC payment every month; they are limited to 25% of your usual monthly payment.
(3) They do not have to take the money back if it will cause hardship.
(4)
----------------------------
“Recovery from Universal Credit
5.19. See Appendix 2 for non-fraud overpayments where the claimant has earnings in excess of the Universal Credit disregard, this deduction is 5 times 5% of the appropriate Universal Credit standard allowance rate.”
https://www.gov.uk/government/publications/benefit-overpayment-recovery-staff-guide/benefit-overpayment-recovery-guide
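The arithmetic in the quoted guidance is what caps the deduction at 25%, not 100%: 5 times 5% of the standard allowance. A minimal sketch of that calculation (the allowance figure below is made up for illustration, not an official rate):

```python
def max_recovery_deduction(standard_allowance: float) -> float:
    """Maximum monthly deduction for a non-fraud overpayment where the
    claimant has earnings: 5 x 5% of the UC standard allowance."""
    return 5 * 0.05 * standard_allowance


# Hypothetical standard allowance of 400/month:
# the most they should deduct is 100, i.e. 25% -- not the full payment.
print(max_recovery_deduction(400.0))
```

So if your whole monthly payment is being withheld, the deduction is far above what the staff guide itself describes.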
-----------------------------------
S.7(3)(a) of the UC, PIP, JSA and ESA (Decisions and Appeals) Regs 2013 specifies that in order for the decision to be subject to the MR process the decision notice must inform the person of the time limit for making an application for revision. S.7(3)(b) makes a similar requirement in relation to requesting a written statement of reasons for the decision.
reg 51(2) of the D&A Regs requires the SSWP to give notice of a right of appeal in any decision which has one
---------------------------
"What to do if you think this decision is wrong?
If you think the decision is wrong, please get in touch with us by telephone or in writing within one month of the date of this letter. If you do not contact us within one month of the date of this letter we may only be able to change the decision
from the date you contact us. Our telephone number and address are on the front page of this letter.
You can appeal against this decision, but you cannot appeal until we have looked
at the decision again. We call this a Mandatory Reconsideration.
You, or someone who has the authority to act for you, can:
ask for an explanation of the decision, or
ask for a written statement of reasons for the decision, if we have not already sent one
ask us to look at the decision again, to see if it can be changed. There may be some facts you think we have overlooked, or you may have further information that affects the decision.
When we have looked at the decision again, we will send you a letter explaining what we have done. We call this a Mandatory Reconsideration Notice. This will include the information you need to be able to appeal."

submitted by RyanBThiesant to FairlyLegalAdviceUK [link] [comments]


2024.04.28 21:38 Extension_Concern935 AITA for thinking that my family doesn’t really want me.

I'm sorry if something is written wrong, but English is my second language.
My parents have four daughters. I'm the second, and the youngest sister was born with the biggest age gap from the three of us. When she (Mandy) was born we were 13 (Cristina), 9 (me) and 7 (Julia). Those are all fake names. When I was a kid I didn't realise how differently my mom especially treated me.
I started noticing it when my third-born sister, Julia, started calling me names and never got in trouble for it, but when I wanted to fight back instead of just standing there listening to what she had to say, no matter what I did, even just telling her to shut up, I would be grounded. And it wasn't like she went to our parents, told them what I did and they just wouldn't believe me; they would be standing right next to us, and I would be given a lecture.
With Cristina it got easier as I got older, because for as long as I can remember I was literally her maid. I would make her food, coffee, bed, clothes, everything, and you might think the only thing she did for herself was studying; unfortunately you'd be wrong. If she was studying, I wasn't just quizzing her: I had to sit in her room and listen to what she was reading, even though she is 4 years older than me and I couldn't understand a single thing she was saying. When I was around 11 or 12 years old I told her that I would no longer be helping her, and she has made my life miserable ever since: she throws her dirty clothes into my closet, steals my makeup, tells mom everything I didn't do right. Now I only do my own chores, but that is better than going around doing both my stuff and hers, and of course I'm the evil and selfish one, but sorry, you have to earn my help, I don't do it for free. She also never helps me with anything I ask for.
submitted by Extension_Concern935 to AITAH [link] [comments]


2024.04.28 21:38 Vib_ration An electrical field that surrounds all living things and that can be used to predict diseases has been proven to exist.

Dr. Harold Burr was an Anatomy professor at the Yale University School of Medicine. He published 93 scientific papers regarding the nervous system and bio-energetic phenomena over a forty-year period.
He discovered that our bodies possess an auric field, and in one study (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2602176/pdf/yjbm00523-0035.pdf) he was able to detect a malignant ovarian cancer thanks to this.
Why has medical science ignored the extraordinary breakthroughs from Professor Harold Burr?
In 1910, John D. Rockefeller funded the development of a doctrine called the Flexner Report. It enabled the AMA to monopolize Western medicine with a focus on pharmaceuticals. This successfully destroyed the development of and usage of natural health care methods—labeling anything other than pharmaceutical drugs as unscientific, pseudoscience and woo.
Although his works can be found in the archives of the Yale Journal of Biology and Medicine, they are not mentioned in biology textbooks!
Since abnormalities in your auric field can provide advanced warning of future problems, can you imagine how intentionally shaping your field could benefit you?
Doing this is possible and very simple; you must first recognize that you have come in contact with the key for it through your own activation of your Bioelectricity.
Think about how a simple thought can give you goosebumps all over your body. Your whole physiology will change for a couple of seconds, even minutes, raising the hair all over your body thanks to a simple thought!
That's you activating your Bioelectricity, the same energy that you can use to shape your auric field to your advantage.
This is what people experience as Frisson, or as the Runner's High, or as the Vibrational State before an Astral Projection, or as Qi in Taoism and in Martial Arts, or as Prana in Hindu philosophy and during an ASMR session.
It was researched and documented under many names, by different people and cultures, like Bioelectricity, Life force, Prana, Chi, Qi, Runner's High, Euphoria, ASMR, Ecstasy, Orgone, Rapture, Tension, Aura, Mana, Vayus, Nen, Intent, Tummo, Odic force, Kriyas, Pitī, Frisson, Ruah, Spiritual Energy, Secret Fire, The Tingles, on-demand quickening, Voluntary Piloerection, Aether, Chills, Spiritual Chills and many more to be discovered hopefully with your help.
Here are three written tutorials to help you learn how to specifically do this, to consciously activate it at a higher level and for long duration to shape your auric field.
P.S. Everyone feels it at certain points in their life, some brush it off while others notice that there is something much deeper going on. Those are exactly the people you can find on spiritualchills where they share experiences, knowledge and tips on it.
submitted by Vib_ration to wimhof [link] [comments]


2024.04.28 21:37 Vib_ration An electrical field that surrounds all living things and that can be used to predict diseases has been proven to exist.

Dr. Harold Burr was an Anatomy professor at the Yale University School of Medicine. He published 93 scientific papers regarding the nervous system and bio-energetic phenomena over a forty-year period.

He discovered that our bodies possessed an auric field and in one study; https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2602176/pdf/yjbm00523-0035.pdf was able to detect a malignant ovarian cancer thanks to this.

Why has medical science ignored the extraordinary breakthroughs from Professor Harold Burr?

In 1910, John D. Rockefeller funded the development of a doctrine called the Flexner Report. It enabled the AMA to monopolize Western medicine with a focus on pharmaceuticals. This successfully destroyed the development of and usage of natural health care methods—labeling anything other than pharmaceutical drugs as unscientific, pseudoscience and woo.

Although his works can be found in the archives of Yale Journal of Biology and Medicine, it is not mentioned in biology textbooks!

Since abnormalities in your auric field can provide advanced warning of future problems, can you imagine how intentionally shaping your field could benefit you?

Doing this is possible and very simple, you must first recognize that you have come in contact with the key for this, through your own activation of your Bioelectricity.

Think about how a simple thought can give you goosebumps all over your body. Your whole physiology will change for a couple of seconds even minutes by raising the hair all over your body thanks to a simple thought!

That's you activating your Bioelectricity, the same energy that you can use to shape your auric field to your advantage.

This is what people experience as Frisson, or as the Runner's High, or as the Vibrational State before an Astral Projection, or as Qi in Taoism and in Martial Arts, or as Prana in Hindu philosophy and during an ASMR session.

It was researched and documented under many names, by different people and cultures, like Bioelectricity, Life force, Prana, Chi, Qi, Runner's High, Euphoria, ASMR, Ecstasy, Orgone, Rapture, Tension, Aura, Mana, Vayus, Nen, Intent, Tummo, Odic force, Kriyas, Pitī, Frisson, Ruah, Spiritual Energy, Secret Fire, The Tingles, on-demand quickening, Voluntary Piloerection, Aether, Chills, Spiritual Chills and many more to be discovered hopefully with your help.

Here are three written tutorials to help you learn how to specifically do this, to consciously activate it at a higher level and for long duration to shape your auric field.

P.S. Everyone feels it at certain points in their life, some brush it off while others notice that there is something much deeper going on. Those are exactly the people you can find on spiritualchills where they share experiences, knowledge and tips on it.
submitted by Vib_ration to DreamInterpretation [link] [comments]


2024.04.28 21:36 Vib_ration An electrical field that surrounds all living things and that can be used to predict diseases has been proven to exist.

Dr. Harold Burr was an Anatomy professor at the Yale University School of Medicine. He published 93 scientific papers regarding the nervous system and bio-energetic phenomena over a forty-year period.
He discovered that our bodies possessed an auric field and in one study;
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2602176/pdf/yjbm00523-0035.pdf
was able to detect a malignant ovarian cancer thanks to this.
Why has medical science ignored the extraordinary breakthroughs from Professor Harold Burr?
In 1910, John D. Rockefeller funded the development of a doctrine called the Flexner Report. It enabled the AMA to monopolize Western medicine with a focus on pharmaceuticals. This successfully destroyed the development of and usage of natural health care methods—labeling anything other than pharmaceutical drugs as unscientific, pseudoscience and woo.
Although his works can be found in the archives of Yale Journal of Biology and Medicine, it is not mentioned in biology textbooks!
Since abnormalities in your auric field can provide advanced warning of future problems, can you imagine how intentionally shaping your field could benefit you?
Doing this is possible and very simple, you must first recognize that you have come in contact with the key for this, through your own activation of your Bioelectricity.
Think about how a simple thought can give you goosebumps all over your body. Your whole physiology will change for a couple of seconds even minutes by raising the hair all over your body thanks to a simple thought!
That's you activating your Bioelectricity, the same energy that you can use to shape your auric field to your advantage.
This is what people experience as Frisson, or as the Runner's High, or as the Vibrational State before an Astral Projection, or as Qi in Taoism and in Martial Arts, or as Prana in Hindu philosophy and during an ASMR session.
It was researched and documented under many names, by different people and cultures, like Bioelectricity, Life force, Prana, Chi, Qi, Runner's High, Euphoria, ASMR, Ecstasy, Orgone, Rapture, Tension, Aura, Mana, Vayus, Nen, Intent, Tummo, Odic force, Kriyas, Pitī, Frisson, Ruah, Spiritual Energy, Secret Fire, The Tingles, on-demand quickening, Voluntary Piloerection, Aether, Chills, Spiritual Chills and many more to be discovered hopefully with your help.
Here are three written tutorials to help you learn how to specifically do this, to consciously activate it at a higher level and for long duration to shape your auric field.
P.S. Everyone feels it at certain points in their life, some brush it off while others notice that there is something much deeper going on. Those are exactly the people you can find on spiritualchills where they share experiences, knowledge and tips on it.
submitted by Vib_ration to immortality [link] [comments]


2024.04.28 21:36 Vib_ration An electrical field that surrounds all living things and that can be used to predict diseases has been proven to exist.

Dr. Harold Burr was an Anatomy professor at the Yale University School of Medicine. He published 93 scientific papers regarding the nervous system and bio-energetic phenomena over a forty-year period.
He discovered that our bodies possessed an auric field and in one study;
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2602176/pdf/yjbm00523-0035.pdf
was able to detect a malignant ovarian cancer thanks to this.
Why has medical science ignored the extraordinary breakthroughs from Professor Harold Burr?
In 1910, John D. Rockefeller funded the development of a doctrine called the Flexner Report. It enabled the AMA to monopolize Western medicine with a focus on pharmaceuticals. This successfully destroyed the development of and usage of natural health care methods—labeling anything other than pharmaceutical drugs as unscientific, pseudoscience and woo.
Although his works can be found in the archives of Yale Journal of Biology and Medicine, it is not mentioned in biology textbooks!
Since abnormalities in your auric field can provide advanced warning of future problems, can you imagine how intentionally shaping your field could benefit you?
Doing this is possible and very simple, you must first recognize that you have come in contact with the key for this, through your own activation of your Bioelectricity.
Think about how a simple thought can give you goosebumps all over your body. Your whole physiology will change for a couple of seconds even minutes by raising the hair all over your body thanks to a simple thought!
That's you activating your Bioelectricity, the same energy that you can use to shape your auric field to your advantage.
This is what people experience as Frisson, or as the Runner's High, or as the Vibrational State before an Astral Projection, or as Qi in Taoism and in Martial Arts, or as Prana in Hindu philosophy and during an ASMR session.
It was researched and documented under many names, by different people and cultures, like Bioelectricity, Life force, Prana, Chi, Qi, Runner's High, Euphoria, ASMR, Ecstasy, Orgone, Rapture, Tension, Aura, Mana, Vayus, Nen, Intent, Tummo, Odic force, Kriyas, Pitī, Frisson, Ruah, Spiritual Energy, Secret Fire, The Tingles, on-demand quickening, Voluntary Piloerection, Aether, Chills, Spiritual Chills and many more to be discovered hopefully with your help.
Here are three written tutorials to help you learn how to specifically do this, to consciously activate it at a higher level and for long duration to shape your auric field.
P.S. Everyone feels it at certain points in their life, some brush it off while others notice that there is something much deeper going on. Those are exactly the people you can find on spiritualchills where they share experiences, knowledge and tips on it.
submitted by Vib_ration to Biohacking [link] [comments]


2024.04.28 21:36 Vib_ration An electrical field that surrounds all living things and that can be used to predict diseases has been proven to exist.

Dr. Harold Burr was an Anatomy professor at the Yale University School of Medicine. He published 93 scientific papers regarding the nervous system and bio-energetic phenomena over a forty-year period.
He discovered that our bodies possessed an auric field and in one study;
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2602176/pdf/yjbm00523-0035.pdf
was able to detect a malignant ovarian cancer thanks to this.
Why has medical science ignored the extraordinary breakthroughs from Professor Harold Burr?
In 1910, John D. Rockefeller funded the development of a doctrine called the Flexner Report. It enabled the AMA to monopolize Western medicine with a focus on pharmaceuticals. This successfully destroyed the development of and usage of natural health care methods—labeling anything other than pharmaceutical drugs as unscientific, pseudoscience and woo.
Although his works can be found in the archives of Yale Journal of Biology and Medicine, it is not mentioned in biology textbooks!
Since abnormalities in your auric field can provide advanced warning of future problems, can you imagine how intentionally shaping your field could benefit you?
Doing this is possible and very simple, you must first recognize that you have come in contact with the key for this, through your own activation of your Bioelectricity.
Think about how a simple thought can give you goosebumps all over your body. Your whole physiology will change for a couple of seconds even minutes by raising the hair all over your body thanks to a simple thought!
That's you activating your Bioelectricity, the same energy that you can use to shape your auric field to your advantage.
This is what people experience as Frisson, or as the Runner's High, or as the Vibrational State before an Astral Projection, or as Qi in Taoism and in Martial Arts, or as Prana in Hindu philosophy and during an ASMR session.
It was researched and documented under many names, by different people and cultures, like Bioelectricity, Life force, Prana, Chi, Qi, Runner's High, Euphoria, ASMR, Ecstasy, Orgone, Rapture, Tension, Aura, Mana, Vayus, Nen, Intent, Tummo, Odic force, Kriyas, Pitī, Frisson, Ruah, Spiritual Energy, Secret Fire, The Tingles, on-demand quickening, Voluntary Piloerection, Aether, Chills, Spiritual Chills and many more to be discovered hopefully with your help.
Here are three written tutorials to help you learn how to specifically do this, to consciously activate it at a higher level and for long duration to shape your auric field.
P.S. Everyone feels it at certain points in their life, some brush it off while others notice that there is something much deeper going on. Those are exactly the people you can find on spiritualchills where they share experiences, knowledge and tips on it.
submitted by Vib_ration to lifeextension [link] [comments]


2024.04.28 21:35 Vib_ration An electrical field that surrounds all living things and that can be used to predict diseases has been proven to exist.

Dr. Harold Burr was an Anatomy professor at the Yale University School of Medicine. He published 93 scientific papers regarding the nervous system and bio-energetic phenomena over a forty-year period.
He discovered that our bodies possessed an auric field and in one study;
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2602176/pdf/yjbm00523-0035.pdf
was able to detect a malignant ovarian cancer thanks to this.
Why has medical science ignored the extraordinary breakthroughs from Professor Harold Burr?
In 1910, John D. Rockefeller funded the development of a doctrine called the Flexner Report. It enabled the AMA to monopolize Western medicine with a focus on pharmaceuticals. This successfully destroyed the development of and usage of natural health care methods—labeling anything other than pharmaceutical drugs as unscientific, pseudoscience and woo.
Although his works can be found in the archives of Yale Journal of Biology and Medicine, it is not mentioned in biology textbooks!
Since abnormalities in your auric field can provide advanced warning of future problems, can you imagine how intentionally shaping your field could benefit you?
Doing this is possible and very simple, you must first recognize that you have come in contact with the key for this, through your own activation of your Bioelectricity.
Think about how a simple thought can give you goosebumps all over your body. Your whole physiology will change for a couple of seconds even minutes by raising the hair all over your body thanks to a simple thought!
That's you activating your Bioelectricity, the same energy that you can use to shape your auric field to your advantage.
This is what people experience as Frisson, or as the Runner's High, or as the Vibrational State before an Astral Projection, or as Qi in Taoism and in Martial Arts, or as Prana in Hindu philosophy and during an ASMR session.
It was researched and documented under many names, by different people and cultures, like Bioelectricity, Life force, Prana, Chi, Qi, Runner's High, Euphoria, ASMR, Ecstasy, Orgone, Rapture, Tension, Aura, Mana, Vayus, Nen, Intent, Tummo, Odic force, Kriyas, Pitī, Frisson, Ruah, Spiritual Energy, Secret Fire, The Tingles, on-demand quickening, Voluntary Piloerection, Aether, Chills, Spiritual Chills and many more to be discovered hopefully with your help.
Here are three written tutorials to help you learn how to specifically do this, to consciously activate it at a higher level and for long duration to shape your auric field.
P.S. Everyone feels it at certain points in their life, some brush it off while others notice that there is something much deeper going on. Those are exactly the people you can find on spiritualchills where they share experiences, knowledge and tips on it.
submitted by Vib_ration to fringescience [link] [comments]


2024.04.28 21:35 Vib_ration An electrical field that surrounds all living things and that can be used to predict diseases has been proven to exist.

Dr. Harold Burr was an Anatomy professor at the Yale University School of Medicine. He published 93 scientific papers regarding the nervous system and bio-energetic phenomena over a forty-year period.
He discovered that our bodies possessed an auric field and in one study;
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2602176/pdf/yjbm00523-0035.pdf
was able to detect a malignant ovarian cancer thanks to this.
Why has medical science ignored the extraordinary breakthroughs from Professor Harold Burr?
In 1910, John D. Rockefeller funded the development of a doctrine called the Flexner Report. It enabled the AMA to monopolize Western medicine with a focus on pharmaceuticals. This successfully destroyed the development of and usage of natural health care methods—labeling anything other than pharmaceutical drugs as unscientific, pseudoscience and woo.
Although his works can be found in the archives of Yale Journal of Biology and Medicine, it is not mentioned in biology textbooks!
Since abnormalities in your auric field can provide advanced warning of future problems, can you imagine how intentionally shaping your field could benefit you?
Doing this is possible and very simple, you must first recognize that you have come in contact with the key for this, through your own activation of your Bioelectricity.
Think about how a simple thought can give you goosebumps all over your body. Your whole physiology will change for a couple of seconds even minutes by raising the hair all over your body thanks to a simple thought!
That's you activating your Bioelectricity, the same energy that you can use to shape your auric field to your advantage.
This is what people experience as Frisson, or as the Runner's High, or as the Vibrational State before an Astral Projection, or as Qi in Taoism and in Martial Arts, or as Prana in Hindu philosophy and during an ASMR session.
It was researched and documented under many names, by different people and cultures, like Bioelectricity, Life force, Prana, Chi, Qi, Runner's High, Euphoria, ASMR, Ecstasy, Orgone, Rapture, Tension, Aura, Mana, Vayus, Nen, Intent, Tummo, Odic force, Kriyas, Pitī, Frisson, Ruah, Spiritual Energy, Secret Fire, The Tingles, on-demand quickening, Voluntary Piloerection, Aether, Chills, Spiritual Chills and many more to be discovered hopefully with your help.
Here are three written tutorials to help you learn how to do this specifically, to consciously activate it at a higher level and for long durations to shape your auric field.
P.S. Everyone feels it at certain points in their life, some brush it off while others notice that there is something much deeper going on. Those are exactly the people you can find on spiritualchills where they share experiences, knowledge and tips on it.
submitted by Vib_ration to wanttobelieve [link] [comments]


