
Confusing Perspectives

2014.06.28 06:29 Confusing Perspectives

Blackout megathread in Save3rdPartyApps: https://redd.it/1476ioa
[link]


2015.04.29 08:35 TWICE (트와이스)

For JYP Entertainment's TWICE, by ONCE.
[link]


2013.08.14 15:08 andreasw Antiwork: Unemployment for all, not just the rich!

A subreddit for those who want to end work, are curious about ending work, want to get the most out of a work-free life, want more information on anti-work ideas and want personal help with their own jobs/work-related struggles.
[link]


2024.03.19 22:19 EricLeib New library for simplifying API specs in full-stack typescript projects

TLDR: I'm working on a library and would like some feedback.
This library aims to simplify the interface between back-end and front-end in full-stack TypeScript projects (e.g. a NestJS API and an Angular app).
A common practice for ensuring consistency between back-end and front-end is to annotate the back-end with decorators, generate an OpenAPI spec from it, and then run code generation to produce a typed client for the front-end.
What if we could skip the code generation step and reduce the number of decorators in the back-end code, without losing the benefits of type safety and request validation?
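For context, a rough sketch of that conventional setup, assuming @nestjs/swagger plus class-validator; all names here are illustrative and not from the post:

```ts
import { Body, Controller, Post } from '@nestjs/common';
import { ApiProperty } from '@nestjs/swagger';
import { IsString } from 'class-validator';

class CreateUserDto {
  @ApiProperty() @IsString() firstName!: string;
  @ApiProperty() @IsString() lastName!: string;
}

@Controller('users')
export class UsersController {
  @Post()
  createUser(@Body() dto: CreateUserDto) {
    // The decorated DTO drives both runtime validation (via a ValidationPipe) and the
    // generated OpenAPI document, which a code generator then turns into a typed client.
    return dto;
  }
}
```

The library described below aims to derive the same artifacts (presumably the validation, the OpenAPI document, and the shared client types that the "3-for-1" mentioned further down refers to) from a single zod-based definition.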
Here is the approach I am taking:

const user = z.object({
  firstName: z.string(),
  lastName: z.string(),
});

const createUser = operation({
  method: 'POST',
  body: user,
});

const getUser = operation({
  method: 'GET',
  path: '/{userId}',
  routeParams: z.object({ userId: z.coerce.number() }),
  response: z.array(user),
});

const getUsers = operation({
  method: 'GET',
  queryParams: z.object({ userId: z.coerce.number() }).optional(),
  response: z.array(user),
});

const userResource = {
  path: '/users',
  tags: ['Users'],
  operations: { createUser, getUser, getUsers },
};
From there, you get 3-for-1:
Here is what the controller looks like (I played with a few variations of this, but this is the one I prefer so far):
@Api(userResource) // This automatically maps all the routes below
@Controller()
class MyController implements ApiService {
  async createUser(@ApiRequest() req: req) {
    // req.body is defined and validated
  }
  async getUser(@ApiRequest() req: req) {
    // req.route.userId is defined and converted to number
  }
  async getUsers(@ApiRequest() req: req) {
    // req.query.userId is defined (optionally)
  }
}
Here is a gist with a full example to give you an idea.
submitted by EricLeib to nestjs [link] [comments]


2024.02.26 04:17 Leanador Using Expo Router, how to render different component if user already signed in?

Hi,
I am looking to render the correct screen when my app loads, given if the user has already signed in or not. I have been stumped finding a solution for a few days, so any help would be greatly appreciated. Thank you in advance!
CONTEXT:
Here is my file structure using Expo Router:
root
⤷ (app)
   ⤷ (tabs)
      ⤷ home
         ⤷ index.tsx
⤷ (auth)
   ⤷ index.tsx
_layout.tsx
Essentially, (auth)/index.tsx is a page with sign-in and sign-up buttons, and (app) is where the remainder of the application is.
I created a context provider for authentication called AuthProvider, which determines which screen to route to given some state changes. I have this context provider set at root/_layout.tsx
... return (    ) 
PROBLEM:
However, AuthProvider has an issue with routing to the correct initial startup screen. Here is some AuthProvider logic:
...
const [user, setUser] = useState(null);
const router = useRouter();
const rootSegment = useSegments()[0];

useEffect(() => {
  // No user -> go to (auth)
  if (!user && rootSegment !== '(auth)') {
    router.replace('(auth)');
  }
  // No user and in (auth) -> do nothing
  else if (!user) {
    return;
  }
  // User -> go to (app)
  else if (user && rootSegment !== '(app)') {
    router.replace('(app)/(tabs)/home');
  }
}, [user, rootSegment])
...
I use Supabase for my backend, and I believe the most guaranteed way of checking if a user is signed in with a valid session is with supabase.getUser(), which returns the user.
An idea I have in mind is the following:
I need to somehow initialize user with supabase.getUser() only once when the app loads, then call AuthProvider.setUser() (defined below) so that it triggers the useEffect hook and immediately routes to (app).
MY FLAWED SOLUTION:
I added a new useEffect with no dependencies to call supabase.getUser() on the first render, which almost works. The flaw is that the (auth) screen renders for a split second before the if (user && rootSegment !== '(app)') condition above is true and router.replace('(app)/(tabs)/home') is called.
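A minimal sketch of one way to close that gap, assuming supabase-js v2 (where the call is supabase.auth.getUser()): gate both the redirect logic and the rendering behind an initializing flag, so nothing is shown until the startup check has resolved. Everything beyond what the post already defines is illustrative.

```tsx
import React, { useEffect, useState } from 'react';
import { useRouter, useSegments } from 'expo-router';
// The supabase client import is assumed, e.g.: import { supabase } from '../lib/supabase';

export function AuthProvider({ children }: { children: React.ReactNode }) {
  const [user, setUser] = useState<any>(null);
  const [initializing, setInitializing] = useState(true);
  const router = useRouter();
  const rootSegment = useSegments()[0];

  // Run the session check exactly once on startup.
  useEffect(() => {
    supabase.auth.getUser().then(({ data, error }) => {
      if (!error) setUser(data.user);
      setInitializing(false); // routing may only begin after this flips
    });
  }, []);

  // The existing redirect logic, now gated on the startup check.
  useEffect(() => {
    if (initializing) return; // don't redirect until the session is known
    if (!user && rootSegment !== '(auth)') router.replace('(auth)');
    else if (user && rootSegment !== '(app)') router.replace('(app)/(tabs)/home');
  }, [initializing, user, rootSegment]);

  // While initializing, render a splash (or null) so neither (auth) nor (app) flashes.
  if (initializing) return null;
  return <>{children}</>;
}
```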
I believe I may be overthinking this problem, so if anyone has any ideas, that would be awesome. Thanks again for taking the time to read this!
submitted by Leanador to expo [link] [comments]


2024.01.24 22:10 MysticAttack Midgame macro and carrying from the jungle

I posted here 2 weeks ago trying to figure out how to carry in a 1v9 setting (https://www.reddit.com/summonerschool/comments/192p4uc/how_to_getuse_agency_from_the_jungle_especially/?utm_source=share&utm_medium=web2x&context=3) and my gameplay has improved since, but I think I've figured out an issue with my gameplay.
The first is how to transition from the laning phase into the midgame. I feel kind of lost because it either boils down to ARAM-ing mid or teammates overextending on the side lanes. Naturally, the obvious answer (it seems to me) is to shadow the side lanes or support the ARAM (if the enemy is also there) to prevent throws, but I'm struggling to figure out how to keep up my farm (and therefore my advantage) without leaving my teammates to die while I do so.
The other is my general inability to carry. Something I've noticed is that in most of my losses I'm by a mile the best player on my team, but in wins I'm either somewhat carried or someone on the enemy team is running it down. I initially attributed this to unlucky games and bad teammates, but with the continuing trend, I'm realizing that I'm 'the best' because I get an early lead, and eventually it gets to the point where I'm the main or only strong player on my team and I'm not using my advantage properly. So even though my teammates *are* playing badly, that's not why we're losing: I'm failing to press my advantage, and then I lose it.

https://www.op.gg/summoners/na/MysticAttack-1660 I've pretty much solidified on Lillia, Kindred, and Viego as my mains
submitted by MysticAttack to summonerschool [link] [comments]


2023.10.31 15:02 DiegoDarkus Setting a cookie in nextjs 14 🥲

I have a Next.js app where I'm trying to set up my authentication/authorization system. It's pretty simple: I have a refresh token (long duration) which I store in an httpOnly cookie, and an access token (short duration) also stored as a cookie (*at first I wanted to send the access token as a `Bearer access-token` header, but setting that header up in server components and on the client side had me crying for almost a week, since it is impossible*).

So, if my access-token cookie expires, I need to call my API (this is where the **problem** shows up) using the endpoint (`/access-token`) that generates the new token and sets the cookie. But if I'm calling this endpoint from a server component, ***the cookie won't be set in my browser***, which I understand, since a server component runs on the server side and not on the client side.

Then I figured: alright, cool, I'll set the cookie in my Next.js app instead. When I call the same endpoint, I take the returned access token and set it using `import { cookies } from 'next/headers'`, but that didn't work either, since server components cannot have side effects (for caching and design reasons, and things of that nature).

Then I went to the docs, which say that if I want to set a cookie I need a server action or a route handler, and that's what I did.
My server component (home page **/**) 👇
const fetchUser = async (): Promise => {
  await getAccessToken("aaight");
  // await axios.post(
  //   "http://localhost:3000/api/set-cookie",
  //   {},
  //   {
  //     headers: {
  //       Cookie: cookies().toString(),
  //     },
  //     timeout: 5000,
  //   }
  // );
  const refreshToken = cookies().get("refresh-token");
  // console.log("refresh token", refreshToken);
  if (!refreshToken) {
    return undefined;
  }
  console.log("\ncookie set\n", cookies().getAll());
  // get user
  const getUser = await foodyApi.get>(
    "/users/me",
    {
      headers: {
        Cookie: cookies().toString(),
      },
      timeout: 10000,
    }
  );
  return {
    user: getUser.data.rsp.user,
  };
};

Server action (**/app/action.ts**) 👇 (this is the first function that runs in my server component, because I need to do these kinds of operations before streaming starts)
"use server"; import { cookies } from "next/headers"; export const getAccessToken = async (token: string) => { cookies().set("access-token", token, { path: "/", domain: "localhost", maxAge: 300, httpOnly: true, secure: false, }); }; 

This also didn't work; I got an error ❌ that says:

⨯ Error: Cookies can only be modified in a Server Action or Route Handler. Read more: https://nextjs.org/docs/app/api-reference/functions/cookies#cookiessetname-value-options
at getAccessToken (./app/action.ts:15:61)
at fetchUser (./app/page.tsx:20:66)
at Home (./app/page.tsx:51:24)
at async Promise.all (index 0)

Then i try with an api route handler 👇 (**/app/api/set-cookie**)
import { cookies } from "next/headers";

export const POST = async (req: Request) => {
  // FIXME: check we got refresh token and api key
  console.log(cookies().get("access-token"));
  cookies().set("access-token", "we did it");
  return Response.json({}, { status: 200 });
};

That didn't work either.

**It's been a week** of me trying to set up a normal auth system for this app, and I've already changed parts of my system so that it fits Next.js, but I'm stuck again. How am I supposed to set my cookie now? Redirect to another page marked `use client`, set the cookie there, then redirect back to my server-component page? That workaround sounds horrible, and I worry about the user experience, man...
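For what it's worth, one commonly suggested workaround for this exact situation (a sketch, not necessarily the right fit for this app): refresh the access token in middleware.ts, where writing cookies onto the outgoing response is allowed. The /access-token endpoint and cookie names follow the post; the API URL, response shape, and matcher are illustrative.

```ts
import { NextRequest, NextResponse } from 'next/server';

export async function middleware(request: NextRequest) {
  const access = request.cookies.get('access-token');
  const refresh = request.cookies.get('refresh-token');

  // Nothing to do if the access token is still there, or if we can't refresh anyway.
  if (access || !refresh) return NextResponse.next();

  // Ask the backend for a fresh access token using the refresh-token cookie.
  const apiRes = await fetch('http://localhost:4000/access-token', {
    headers: { Cookie: `refresh-token=${refresh.value}` },
  });
  if (!apiRes.ok) return NextResponse.next();
  const { accessToken } = await apiRes.json(); // response shape assumed

  // Cookies CAN be written on the response here, unlike in a Server Component.
  const response = NextResponse.next();
  response.cookies.set('access-token', accessToken, {
    path: '/',
    maxAge: 300,
    httpOnly: true,
  });
  return response;
}

export const config = { matcher: ['/((?!_next|api|favicon.ico).*)'] };
```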
submitted by DiegoDarkus to nextjs [link] [comments]


2023.10.08 05:26 GuyBelmont New Guy Belmont and Yuki Getsu character Bios

New Guy Belmont and Yuki Getsu character Bios
Here are the new character bios. All art was done by the amazingly talented Morine / JoshYal.Art
A Family History smeared in blood
Guy's equipment
The Vampire Killer Excalibur whip
The legendary holy whip that is the bane of the night, passed down through the Belmont family for generations. The history of how the whip came to be has been mostly lost to time; only a handful of people know it. What is known, however, is that the whip is not only mind-bendingly powerful in its holiness.
It is also VERY versatile, able to make use of magic stones, tips, and earth orbs to power itself up.
It's also rumoured that the Gandolfi family have, throughout the ages, hidden many upgrades for the whip in mausoleums and other such places all over the world, to help further strengthen the Vampire Killer.
Simon's Plate
This legendary plate has seen many horrible nights, but has helped vanquish the evil with the morning sun.
Cyclone Boots
Legendary boots said to give the wearer tremendous speed and jumping ability.
Seraph Shoulders
This legendary relic is said to be made from the wing of an archangel, giving the user the ability to soar to great heights.
Legend says that House Belmont was gifted this divine relic to better aid them in their hunting of the night.
Belmont Gauntlet
A gauntlet that has been passed down through the clan for generations. It has been updated many times and holds mysterious powers.
Special techniques
Holy Flame Whip: When Guy charges up his holy power, he is able to turn the Vampire Killer into a powerful holy flame whip. The higher Guy's willpower rises, the stronger the holy flame whip becomes and the longer it lasts.
Divine Rush: This attack combines speed and power. Guy strikes with the whip at such incredible speed and force that it becomes a blur of glowing white. The faster and more powerful the whip gets, the brighter it glows; there is also a purple hue that can be seen around it.
This technique not only creates a vortex that draws any unholy being in, but each devastating strike also produces a powerful sonic boom, destroying them both inside and out. Any unholy being caught in this attack will be hit 999 times. It's one of Guy's best-known attacks.
He saw a technique very much like this one in the Belmont Tome, so he used it as a base and fitted it to his own fighting style, creating this powerful technique.
Omnia Vanitas ("All is Vanity"): an ancient technique brought back by Leon Belmont from his travels around the world to become a better monster hunter. The technique originates in Japan; its original name was Shikisokuzekū, "all is nothing, nothing is all".
This powerful technique makes the user invulnerable and lets them evade enemies and pass through solid objects such as monsters or fire walls, leaving only their aura behind. It also allows the user to cover a lot of ground. There is a downside, however: not only is the technique very hard to master due to its complexity, but it also consumes a LOT of energy to use, and that goes double for prolonged usage. Because of this complexity it was shelved in the hold and forgotten to time. It was only found by Guy Belmont in 2994, when he was exploring the hold trying to absorb as much information as he could about his family and their methods. After spending months on it, he finally managed to master it.
Standard Sub weapons
Like his ancestors, he is armed with the standard sub-weapons:
Axe
Holy water
Dagger
cross
stopwatch


The Light In the Darkness.
Yuki's equipment
Alucard Spear
The legendary spear said to have been forged from the remains of the spear that beheaded Vlad Țepeș.
The legend goes that, years later, the legendary hunter Alucard found the remains of the spear and sensed a great and powerful aura emanating from it. So he decided to reforge it, using his family's book of secret arts and fusing it with the original trident stake of the Vampire Killer whip, which was said to have been removed during a battle.
And so, with the reforging process completed, the Alucard Spear was born, and Alucard had achieved his goal of creating a companion weapon to the holy whip.
Getsu Clan Armour
Family armour that has been passed down for generations. They say the armour was a gift from the great dragon god; it's believed to have been made from his own bones.
Hadou Katana
This spiritual blade, once swung, releases a wave of energy inflicting incredible damage on evil.
This is the last known Hadou Katana. The treasured blade has been passed down through the clan for generations; being entrusted with it is the only true way to become head of the Getsu clan.
Any clan candidate who wants to become head of the clan must prove themselves worthy of the blade. Once a candidate has proven themselves worthy, they are entrusted with the Hadou Katana and become the new head of the clan, able to enter all of the secret parts of the Getsu Clan estate.
Legend says that the other two blades MAY still be out there, but this is often dismissed as a fairy tale or folklore.
This spiritual blade is made of both light and darkness. A user can normally wield one element at a time, turning it into a blade of light or of darkness, BUT only three users in all of the clan's history have been able to use both light and dark at the same time.
The first was the sister who went to Raging Demon Island to avenge her fallen older brothers, retrieve the other Hadou Katanas, and seal away Ryukotsuki; it is this brave woman who started Yuki's line.
Kasumi, Yuki's mother, is another who can use both light and dark at the same time with the blade. And finally, Yuki too is able to wield both light and darkness at once when she uses it.
Iwa no Ken (Rock Sword)
It is said this blade is so powerful it can cut through almost anything. Given to Yuki by her mother, it was her first weapon before the Alucard Spear.
Fuma's Greaves
A great relic passed down through the Getsu clan. They are made of bone, but no one is sure what sort of bone... Legends say they increase running speed.
Special techniques
Dainendōha: "Great Will Aura": When Yuki fuses the other two Hadou Katanas, she is able to unleash an unbelievably powerful attack known as the legendary Dainendōha.
The Power of Dominance: The Getsu clan have the ability to absorb "Tamashii" ("soul, spirit") from slain monsters, and doing so heals their wounds a little.
However, Yuki is able to hold on to the souls of the monsters she kills and later bring them out or use their powers. When Kasumi (Yuki's mother) first discovered Yuki could do this, when Yuki was 7, she was not sure what to make of it. When she asked Yuki about it, Yuki gave a simple answer:
"I feel real sorry for 'em bein' all cooped up in there, so I like to let 'em out to breathe once in a while. Don't worry, mama, they won't go and do bad things any more, they promised." She smiled so earnestly, and even though this technique comes from the darkest of places, Kasumi could sense no darkness coming from Yuki. Knowing that, sadly, because of what and who Yuki was she would most likely be hunted down one day, Kasumi wanted to train her as best she could so that she would be able to defend herself.
She told her to keep up her training of this technique, but to do it in secret, as no one would understand. And so Yuki did, and she now has incredible control over it.
Head Chain of Love and Protection
This head chain has the power to help protect against all forms of attack. When you hold it in your hand, you can feel a warm feeling of love spread through your whole body. It is said to have been a gift from the progenitor of the Belmont clan to his betrothed, and was given to Yuki by Guy as a way of signifying the bond between them.
Getsu Clan regulation attack arms
She is equipped with the same attack arms her clan is famous for
The Defensive Drum
The Curse Explosive
The Shuriken
The Defensive Ball
The Devilish Top
Thank you all for looking at these bios.
Thank you, everyone, for all your support.
submitted by GuyBelmont to DraculasCastle [link] [comments]


2023.09.26 18:29 jezmilar Managed PostgreSQL with Next.js 13 (without an ORM) - optimal way to set it up? [code inside]

I just set up a free database on elephantsql.com (I could've used Heroku, DigitalOcean, GCP, etc.) and connected it to my Next.js project. It's an application that will be used by many users making multiple queries to the db throughout the day, so I used new Pool() instead of Client. I could also use Supabase or some other more "complete" solution, but for now I want to keep the flexibility and keep it simple.
Question: with Next.js 13 (App router) is this the right/optimal way of connecting to my db and making queries? My goal is not to overwhelm the db with many unnecessary connections and go over the usage limits (and unexpectedly be hit by a high bill).
I kept refreshing the page locally (localhost:3000) on multiple tabs - elephantsql.com showed only 1 connection while making queries at the same time (get users) to the db. So I guess that's a good thing?
Here's my project's structure, I set it up roughly like this.
.env
CONNECTION_STRING = postgres://foo:asd-asd-asd@bar.db.elephant.com/lorem
db/db.ts
import { Pool } from "pg";

const connection = new Pool({
  connectionString: process.env.CONNECTION_STRING,
});

export default connection;
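One common refinement of this file (a sketch, assuming the usual Next.js dev-server hot-reload behaviour): cache the pool on globalThis so repeated reloads in development don't keep opening new pools and exhausting the small connection limit of a free managed Postgres instance.

```ts
import { Pool } from "pg";

// Reuse a single Pool across hot reloads in development.
const globalForPg = globalThis as unknown as { pgPool?: Pool };

const connection =
  globalForPg.pgPool ??
  new Pool({ connectionString: process.env.CONNECTION_STRING, max: 5 }); // max is illustrative

if (process.env.NODE_ENV !== "production") globalForPg.pgPool = connection;

export default connection;
```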
app/api/get-users/route.ts
import connection from "@/db/db";

export async function getUsers() {
  const query = "SELECT * FROM users";
  const users = await connection.query(query);
  return users.rows[0] as { email: string };
}
app/page.tsx
import { getUsers } from "@/app/api/get-users/route";
export default async function Home() {
  const data = await getUsers();
  return (
    {data.email}
  );
}
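One structural note, hedged: with the App Router convention, a file at app/api/.../route.ts is expected to export HTTP-method handlers (GET, POST, and so on) rather than arbitrary functions, so importing getUsers from a route file into the page is a bit unusual. A sketch of one way to split it (file paths here are illustrative):

```ts
// db/users.ts: a plain data helper the Server Component can await directly
import connection from "@/db/db";

export async function getUsers() {
  const { rows } = await connection.query("SELECT * FROM users");
  return rows as { email: string }[];
}

// app/api/get-users/route.ts: only needed if an HTTP endpoint is also wanted
import { NextResponse } from "next/server";
import { getUsers } from "@/db/users";

export async function GET() {
  return NextResponse.json(await getUsers());
}
```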
Is this a simple solution that could work? Do you have any tips? Thanks in advance! 😊
submitted by jezmilar to webdev [link] [comments]


2023.09.20 12:39 kzovk State Management patterns between Server and Client components

State Management patterns between Server and Client components
Hi everyone.
I want to clarify State Management patterns in the NextJS App router. It kinda feels like an obvious topic because there are literally no posts about it, so maybe I'm overthinking this. However, I've built a few small apps already, and each time I felt I was doing something wrong.

❗️Disclaimer: I am fairly new in the coding world, so I might be doing something fundamentally wrong. Please roast my approach and point out all the faults in my thinking.

To simplify some stuff, let's say we have the following structure in the project. Let's assume we have only two types of data: User and UserProjects (that rely on UserID).
https://preview.redd.it/dc477zgxxdpb1.jpg?width=2728&format=pjpg&auto=webp&s=ba5f1f410cbe9e2cdca38367844be581cb22b555
---
Now, I can see three options to get the correct data:
Option 1: Server and drilling -- Data is fetched in the Layout and Page, then passed down to the components below.
https://preview.redd.it/nks3xvn7ydpb1.jpg?width=2670&format=pjpg&auto=webp&s=eeeac6bd28a80b89f66f87e7bf2b5f6e92f6d53c

Option 2: Server and drilling and react query --
User data:
  1. Layout - gets the data on the server and passes them down to Server and Client components, direct children that use the User Data
  2. Client Component - fetches the data using React Query and Server Action, because direct parent is not a server component
Project data:
  1. Is fetched directly in each component, once through server and once through ReactQuery + Server action.
https://preview.redd.it/mm92ojl9ydpb1.jpg?width=2748&format=pjpg&auto=webp&s=39338e4cf7008faf3cad031e9bf7b30e4d7f9a3b
Option 3: Server and react query -- Data is fetched directly in the components that need the data, using Server functions or React query - depending on the component type.
https://preview.redd.it/hwav7j1nydpb1.jpg?width=2900&format=pjpg&auto=webp&s=c27f9bd3313e8538a0df8daf937befbadb204368
Now, we have to remember that I'm using React Query with Server actions - so the function for fetching this data is basically the same for client and server - it's just wrapped with React Query for the Client Components:
https://preview.redd.it/1t2i8xfe0epb1.png?width=1266&format=png&auto=webp&s=f5c4b408d13622bb844e4ef55fc6d30e41d2f89e
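To make that concrete, a minimal sketch of such a wrapper, assuming @tanstack/react-query v5 and a getUser server action exported from app/actions.ts (names are illustrative, not from the linked repo):

```tsx
'use client';

import { useQuery } from '@tanstack/react-query';
import { getUser } from '@/app/actions'; // 'use server' function, also awaitable in Server Components

export function UserBadge({ userId }: { userId: string }) {
  const { data: user, isPending } = useQuery({
    queryKey: ['user', userId],
    queryFn: () => getUser(userId), // the same function a Server Component would call directly
  });

  if (isPending) return <span>Loading…</span>;
  return <span>{user?.name}</span>;
}
```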
Now, I'm asking because I've used a mix of options 2 and 3 while trying to find the best pattern. However, I do feel that fetching directly in the server components makes my website slow. It feels like every time I switch from subpage to subpage, the code for getUser and getUserProjects runs again, slowing everything down (because the page loads only after the data is fetched).
This problem does not occur with React Query, as the data is refreshed in the background.
I also don't understand whether Next.js calls the database every time to refresh the user, even though the middleware already does that on every page change. Maybe I'm missing something.
So, the questions are:
  1. How do you approach state management in which some Server and Client components without direct parent/child relationships require the same data?
  2. Is any of those patterns correct? If yes, which one?
  3. What are the advantages and disadvantages of those three options?
  4. Why does my app feel slow when I use plain getUser and getUserProjects in the server components? How does caching work? Is using React Query the correct way of doing this?
  5. If React Query is preferable for reducing the "slowness", should I create more Client Components than Server Components?
  6. Is there any other way to do this? Am I missing something?

Here's the repo with example code: https://github.com/ky-zo/nextjs-app-router-state-mgmt

Penny for your thoughts!
Happy to learn from you. Cheers!
submitted by kzovk to nextjs [link] [comments]


2023.08.15 14:54 Mxfrj Project structure with database transactions

Hello,
I am currently trying to build a project with a better structure; in the end I would love to reuse that "idea" in other projects too. In my current design the main problem is that I always have to duplicate queries, because I can't use transactions when I need them. If I need a transaction, I can't just call GetUser (for example), because I'm not passing a database object into the function; I'm using a global one. That means I can't start a transaction somewhere and pass it around, which I would really love to do. It would eliminate a lot of the duplicated queries that only exist so they can run inside a transaction.
So I looked at multiple projects and this one here is looking the best fit:
https://github.com/benbjohnson/wtf
They start a transaction and pass it into the function; for example: https://github.com/benbjohnson/wtf/blob/main/sqlite/auth.go#L25
So I tried to reproduce that (I'm using sqlx, if that matters). I sadly feel like my code is really ugly because of the repeated .db.db calls, and I'm also unsure about the Queryable struct, which doesn't support all sqlx calls; I only added a few, so does that mean I need to add all of them? I understand how they remove one .db from the BeginTx call by implementing their own function (https://github.com/benbjohnson/wtf/blob/main/sqlite/sqlite.go#L193). They do that for every function; is that really good style? I'm just not sure whether rewriting every function is a good habit. But here I'm a bit clueless, tbh :D
I added some comments inside the code so you directly see what I want to do. How can I improve that current layout? And yes, I could remove the domain layer and directly move to the json variant in GetUser for example. But I probably need them in the functions inside of GetUser.
https://pastebin.com/tfa8NC0i
I uploaded my code to Pastebin; hope that's okay, and if not I can upload it somewhere else. Feel free to ask questions, as I'm not sure everything makes sense. I also feel like passing the context around as an argument everywhere is wrong, but I'm not sure about that either. If I follow the wtf project, I should have ctx as an argument everywhere, but they also pass it into that UserService, so I don't really know what to do and what ends up duplicated there.
Thanks
submitted by Mxfrj to golang [link] [comments]


2023.08.14 23:08 alexyakunin Poor gRPC performance on test - help needed

Hi, I'd love to get some help on why I'm seeing this on a small RPC performance benchmark that I recently wrote:
System-wide settings:
  Thread pool settings: 48+ worker, 48+ I/O threads
  ByteSerializer.Default: MessagePack
Starting server @ https://localhost:22444/
Client settings:
  Server URL: https://localhost:22444/
  Test plan: 5.00s warmup, 4 x 5.00s runs
  Total worker count: 14400
  Client concurrency: 120
  Client count: 120

Stl.Rpc:
  Sum      : 3.31M 3.21M 3.17M 3.24M -> 3.31M calls/s
  GetUser  : 2.66M 2.66M 2.62M 2.64M -> 2.66M calls/s
  SayHello : 1.75M 1.72M 1.73M 1.72M -> 1.75M calls/s
SignalR:
  Sum      : 2.73M 2.72M 2.67M 2.71M -> 2.73M calls/s  <-- compare this
  GetUser  : 2.36M 2.31M 2.35M 2.35M -> 2.36M calls/s
  SayHello : 1.21M 1.20M 1.18M 1.20M -> 1.21M calls/s
gRPC:
  Sum      : 116.31K 129.65K 125.22K 125.28K -> 129.65K calls/s  <-- with gRPC
  GetUser  : 124.57K 125.63K 126.02K 122.40K -> 126.02K calls/s
  SayHello : 119.60K 123.33K 122.29K 120.76K -> 123.33K calls/s
HTTP:
  Sum      : 134.79K 143.42K 148.22K 143.03K -> 148.22K calls/s  <-- moreover, gRPC ~= HttpClient
  GetUser  : 141.18K 144.50K 139.19K 145.96K -> 145.96K calls/s
  SayHello : 129.53K 132.30K 128.41K 130.47K -> 132.30K calls/s
To run the test locally:
git clone git@github.com:servicetitan/Stl.Fusion.Samples.git
cd Stl.Fusion.Samples
dotnet run -c Release --project src/RpcBenchmark/RpcBenchmark.csproj -- test
You can also run its server & client on different machines like this:
dotnet run -c Release --project src/RpcBenchmark/RpcBenchmark.csproj -- server https://0.0.0.0:8888/
dotnet run -c Release --project src/RpcBenchmark/RpcBenchmark.csproj -- client https://serverIPOrHostName:8888/
And two key options controlling the client are:
You can also run this test specifically for gRPC by using -b grpc - and e.g. in my case it produces max throughput (~ 180K calls/s) with these parameters:
dotnet run -c Release --project src/RpcBenchmark/RpcBenchmark.csproj -- test -w 10000 -cc 1000 -b grpc 
Some notes on the test itself:
Assuming you run this test locally, this is how you can tweak the settings for max throughput:
More details on why this specific test is important & why it's actually a real-life scenario as well: https://servicetitan.github.io/Stl.Fusion.Samples/#5-rpcbenchmark-sample
The feedback I'd love to get:
1) Is there any way to increase gRPC throughput on this test? Tweaking some parameters specifically for gRPC test (e.g. using -cc 1000 or so) can bump its throughput to ~ 180K, but it’s still quite far from SignalR.
2) Do you see any issues in test code - in particular, in server or client setup?
And if you don't see anything suspicious, do you feel that this is ~ an expected gRPC performance from your experience? My take on this is:
P.S. If you end up running the test, please share your results - esp. on any Unix. All the results above were produced on Windows 11.
submitted by alexyakunin to dotnet [link] [comments]


2023.05.17 18:05 Safe-Lander Can this approach be called "advanced" Authentication/Authorization?


This is from a tutorial I found on YT that calls this advanced Authentication/Authorization.
I wonder if this is indeed considered advanced, or at a minimum better than storing JWTs in local storage?
user-controller.js ``` const brcypt = require("bcryptjs"); const jwt = require("jsonwebtoken"); const User = require("../model/User"); const signup = async (req, res, next) => { const { name, email, password } = req.body; let existingUser; try { existingUser = await User.findOne({ email: email }); } catch (error) { console.log("error: ", error); } if (existingUser) return res.status(400).json({ message: "User already exists" }); const hashedPassword = brcypt.hashSync(password); const user = new User({ name, email, password: hashedPassword, }); try { await user.save(); } catch (error) { console.log("error: ", error); } return res.status(201).json({ message: user }); }; const login = async (req, res, next) => { const { email, password } = req.body; let existingUser; try { existingUser = await User.findOne({ email: email }); } catch (error) { return new Error("error: ", err); } if (!existingUser) return res.status(400).json({ message: "User not found. Signup please" }); const isPasswordCorrect = brcypt.compareSync(password, existingUser.password); if (!isPasswordCorrect) return res.status(400).json({ message: "Invalid email or password" }); const token = jwt.sign({ id: existingUser._id }, process.env.JWT_SECRET_KEY, { expiresIn: "35s", }); if (req.cookies[`${existingUser._id}`]) { req.cookies[`${existingUser._id}`] = ""; } res.cookie(String(existingUser._id), token, { path: "/", expires: new Date(Date.now() + 1000 * 30), httpOnly: true, sameSite: "lax", secure: process.env.NODE_ENV, }); return res .status(200) .json({ message: "Successfully logged in.", user: existingUser, token }); }; const verifyToken = (req, res, next) => { const cookies = req.headers.cookie; // Gets cookies from header const token = cookies.split("=")[1]; if (!token) return res.status(404).json({ message: "Token not found" }); jwt.verify(String(token), process.env.JWT_SECRET_KEY, (error, user) => { if (error) return res.status(400).json({ message: "Invalid token" }); req.id = user.id; }); next(); }; const getUser = async (req, res, next) => { const userId = req.id; let user; try { user = await User.findById(userId, "-password"); // Send all the details of this User entry except the "password" field } catch (error) { return new Error(error); } if (!user) return res.status(404).json({ message: "User not found" }); return res.status(200).json({ user }); }; const refreshToken = (req, res, next) => { const cookies = req.headers.cookie; // Gets cookies from header const prevToken = cookies.split("=")[1]; if (!prevToken) return res.status(400).json({ message: "Couldn't find token" }); jwt.verify(String(prevToken), process.env.JWT_SECRET_KEY, (err, user) => { if (err) { console.log(err); return res.status(403).json({ message: "Authentication failed" }); } res.clearCookie(`${user.id}`); req.cookies[`${user.id}`] = ""; const token = jwt.sign({ id: user.id }, process.env.JWT_SECRET_KEY, { expiresIn: "35s", }); res.cookie(String(user.id), token, { path: "/", expires: new Date(Date.now() + 1000 * 30), // 30 seconds httpOnly: true, sameSite: "lax", secure: process.env.NODE_ENV, }); req.id = user.id; next(); }); }; const logout = (req, res, next) => { const cookies = req.headers.cookie; // Gets cookies from header const prevToken = cookies.split("=")[1]; if (!prevToken) return res.status(400).json({ message: "Couldn't find token" }); jwt.verify(String(prevToken), process.env.JWT_SECRET_KEY, (err, user) => { if (err) { console.log(err); return res.status(403).json({ message: "Authentication failed" }); } res.clearCookie(`${user.id}`); 
req.cookies[`${user.id}`] = ""; return res.status(200).json({ message: "Logged out successfully" }); }); }; exports.signup = signup; exports.login = login; exports.verifyToken = verifyToken; exports.getUser = getUser; exports.refreshToken = refreshToken; exports.logout = logout; ``` 
I've also added a limiter (it wasn't in the tutorial) to make it a bit more secure per ChatGPT's suggestion:
app.js
```
const express = require("express");
const mongoose = require("mongoose");
const router = require("./routes/user-routes");
const cookieParser = require("cookie-parser");
const cors = require("cors");
const rateLimit = require("express-rate-limit");
require("dotenv").config();

const app = express();
app.use(cors({ credentials: true, origin: "http://localhost:3000" }));
app.use(cookieParser());
app.use(express.json());

// Apply rate limiting middleware
const limiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100, // Max requests per windowMs
});
app.use(limiter);

app.use("/api", router);

mongoose
  .connect(
    `mongodb+srv://admin:${process.env.MONGODB_PASSWORD}@cluster0.nrlaqu0.mongodb.net/auth?retryWrites=true&w=majority`
  )
  .then(() => {
    app.listen(5000);
    console.log("Database is connected! Listening to localhost 5000");
  })
  .catch((err) => console.log(err));
```
submitted by Safe-Lander to FullStack [link] [comments]


2023.01.24 17:24 SeeDat_Ghai Anyone have a seller suggestion for this Air Max 97?

Anyone have a seller suggestion for this Air Max 97?
Hi all, looking for this particular air max 97 model
https://preview.redd.it/0bgixy67p0ea1.jpg?width=473&format=pjpg&auto=webp&s=b6990aff9560c5526e7659e3487b5d8b0d1550fa
it is from this link
Found something similar from lol2021 in a previous post here, but the link doesn't seem to be working anymore. Couldn't find it in his current Air Max inventory either.
Saw people saying UMKAO's Air Maxes aren't the best. Based on advice from this post, I tried putting GETU's Taobao link (https://shop114840549.taobao.com/) into Pandabuy, but it kept giving me an error.

Tried putting the image into Pandabuy, and there were fairly expensive reps ($70+) with a decent number of sales, so I'm just checking whether there is a good and cheap option.
Thx
submitted by SeeDat_Ghai to FashionReps [link] [comments]


2023.01.22 08:36 RooCoder Redux API call with dynamic url paramter

Hi,
My API has a GET endpoint that requires each user's name to be appended at the end as a route parameter: www.myapi.com/getuser/{username}
I'm stumped on how to access the username inside the action creator, since useSelector(state => state.user) won't work there; I get some "useContent()" error when I try it.
I can't pass an additional parameter to the action creator since it's "async (req, res)". I don't really know too much about that function, as I'm fairly new.
I'm wondering: is there a way to pass the parameter inside the fetch query? res.parameter or something like that?
My action creator:
getUser.js
export default async (req, res) => {
  if (req.method === 'GET') {
    const username = ??????????
    try {
      // How to get username???
      const apiRes = await fetch(`${API_URL}/GetUser/${username}`, {
        method: 'GET',
        headers: {
          'Accept': 'application/json',
        }
      });
      const data = await apiRes.json();
      if (apiRes.status === 200) {
        return res.status(200).json({ user: data });
      } else {
        return res.status(apiRes.status).json({ error: data.error });
      }
    } catch(err) {
      return res.status(500).json({ error: 'Something went wrong when retrieving user' });
    }
  } else {
    // Error. Not a GET request. They tried POST or PUT etc.
    res.setHeader('Allow', ['GET']);
    return res.status(405).json({ error: `Method ${req.method} not allowed` });
  }
};
My action:
// Get User Object
export const load_user = () => async dispatch => {
  try {
    // Calls the above getUser.js
    const res = await fetch(`/api/getuser`, {
      method: 'GET',
      headers: {
        'Accept': 'application/json',
        'Content-Type': 'application/json',
      }
    });
    const data = await res.json();
    if (res.status === 200) {
      dispatch({ type: LOAD_USER_SUCCESS, payload: data });
    } else {
      dispatch({ type: LOAD_USER_FAIL });
    }
  } catch(err) {
    dispatch({ type: LOAD_USER_FAIL });
  }
};
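One possible way around this (a sketch, not the only option): read the username in a component with useSelector, pass it into the action as an argument, forward it to the API route as a query parameter, and read it from req.query inside the handler. Next.js API routes expose query-string values there. The helper names below mirror the post; everything else is illustrative.

```js
// Component: read the username where hooks are allowed, then dispatch with it.
// const username = useSelector(state => state.user.username);
// dispatch(load_user(username));

// Action: forward the username as a query parameter.
export const load_user = (username) => async dispatch => {
  try {
    const res = await fetch(`/api/getuser?username=${encodeURIComponent(username)}`, {
      method: 'GET',
      headers: { 'Accept': 'application/json' },
    });
    const data = await res.json();
    dispatch(res.status === 200
      ? { type: LOAD_USER_SUCCESS, payload: data }
      : { type: LOAD_USER_FAIL });
  } catch (err) {
    dispatch({ type: LOAD_USER_FAIL });
  }
};

// getUser.js: Next.js API routes expose the query string on req.query.
export default async (req, res) => {
  const { username } = req.query;
  const apiRes = await fetch(`${API_URL}/GetUser/${username}`, { // API_URL assumed from config, as in the post
    method: 'GET',
    headers: { 'Accept': 'application/json' },
  });
  // ...same status handling as before...
};
```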
submitted by RooCoder to reactjs [link] [comments]


2022.12.09 04:49 bugman195 How to form render array from multiple APIs in React

Situation

I need to form a comments timeline, within a single comment there's
  1. Comment text (from API 1)
  2. Comment timestamp (from API 1)
  3. Author Name (use data in API 1 to fetch for API 2)
  4. Author profile photo (use data in API 1 to fetch for API 2)

API Format

API 1. Comments API, an array of comments in chronological order:
```json
[
  { "comment": "Hello, how's your day?", "userId": "001", "timestamp": "1670548338131" },
  { "comment": "Pretty good!", "userId": "002", "timestamp": "1670548338151" },
  { "comment": "Want to hang out later?", "userId": "001", "timestamp": "1670548338171" },
  ...
]
```
API 2. User info API, one user's info per lookup:
```json
{ "userId": "001", "userName": "Ben", "userProfileUrl": "https://www.photo.com/001" }
```

Questions

  1. What's the better way to call API 2 mutiple times?
  2. What's the better way to form the render array? (Want the highest efficiency)

Initial Idea

  1. Fetch comments list first, use a Set to collect unique userIds
  2. Use Promise.all to fetch API 2 parallelly with all the userIds within the Set
  3. Form a dictionary (map) to lookup user info by userId
  4. Iterate the list from step 1, fill in user info one by one, and produce a new list
  5. Set the final result list into useState state
  6. Render

Sample Code

```js
const getComments = () => axios.get('API_1')
const getUser = (userId) => axios.get(`API_2/${userId}`)

const [renderList, setRenderList] = React.useState([])

const initComments = async () => {
  try {
    const res = await getComments()
    const rawComments = res.data
    const usersSet = new Set(rawComments.map(c => c.userId))
    const promises = [...usersSet].map(u => getUser(u))

    // Here's question 1: is it viable?
    const usersInfo = await Promise.all(promises)
    const usersMap = usersInfo.reduce((prev, curr) => {
      return {
        ...prev,
        [curr.data.userId]: curr.data,
      }
    }, {})

    // Here's question 2: is it the most efficient way?
    const finalList = rawComments.map(c => {
      return {
        ...c,
        userName: usersMap[c.userId].userName,
        userProfileUrl: usersMap[c.userId].userProfileUrl,
      }
    })
    setRenderList(finalList)
  } catch(err) {}
}

React.useEffect(() => { initComments() }, [])

return renderList.map(item => <>{/* Do the render here */}</>)
```
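One small tweak worth considering for the lookup map in step 3 (a sketch; same data shapes as above): building it with Object.fromEntries avoids re-spreading the accumulator on every iteration, which the reduce version does.

```js
// userId -> user info, built in a single pass
const usersMap = Object.fromEntries(usersInfo.map((res) => [res.data.userId, res.data]));
```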
submitted by bugman195 to reactjs [link] [comments]


2022.12.02 16:26 TakenToTheRiver Status of All Software Updates is "Not Required" to 99% of Clients

Can't figure this one out. CM version = 2111 with all hotfixes installed.
Discovered recently that 99% of clients are not installing Software Updates because the updates' status is "Not Required."
I found these errors in the ScanAgent log, which are not present in the logs on the 1% of clients without this issue:
CScanAgentCache::PersistInstanceInCache - failed at PopulateWMIObjectFromInstance with Error=0x80041005
ScanJob({4BBC3ADE-5231-4DEB-844A-DD53552924B8}): CScanJob::Execute - SKIPPING SCAN and Using cached results, ScanType=2

I'm uncertain if it's related, but CcmMessaging.log shows these errors:
Could not open registry key for user S-1-5-21-169686320-1948071156-1859928627-552409, 0x80070002
GetuserTokenFromSid, couldn't find logon session for user sid S-1-5-21-169686320-1948071156-1859928627-552409
Post to http:///ccm_system/request failed with 0x87d00323.

Also found this in LocationServices.log:
Error reading location override information from WMI. Return code: 0x80041010

I thought there might be a WMI issue (which would be odd on 99% of clients), so I looked in wmimgmt.msc > WMI Control > Properties. It connected successfully though, with no errors reported.

I found this MS support doc about testing Windows Update Agent for offline updates (Using WUA to Scan for Updates Offline - Win32 apps Microsoft Learn), and the scan results came back showing 3 applicable updates for an affected client (Win10 CU, .NET CU, and Malicious Software Removal update).
I also tried resetting WU using this script from MS (How to:Reset Windows update components in Windows - Microsoft Community), which made no difference.
Anyone have advice for how to proceed? At this point, I don't know if the issue lies on the client side or the CM side. TIA!

Edit: WUAHandler also shows:
OnSearchComplete - Failed to end search job. Error = 0x8024001e.
Scan failed with error = 0x8024001e.
submitted by TakenToTheRiver to SCCM [link] [comments]


2022.11.23 07:39 ErroneousLogik What is the best way to create a protected route with React Router v6.4 loaders?

Hi folks,
Recently been building a little personal blog using React and React Router v6.4 and having trouble trying to created a protected route using loaders.
In order to prevent an XY problem, what I'm looking to try and do is to prevent a route from loading if a user is not authorised and redirect them back to the homepage.
I have tried 2 options so far:
Lets say for example my index.js file contains a router as follows
const router = createBrowserRouter([
  {
    path: "/",
    element: ,
    errorElement: ,
    children: [
      {
        path: "new",
        element: ,
        loader: NewPostLoader,
      },
    ],
  },
]);
My associated loader function NewPostLoader is below
export async function NewPostLoader() {
  // I want to check if the user is authorised here using an api call,
  // and if not: return redirect("/");
  let ret = {};
  ret['tags'] = await getTagsAll();
  return ret;
}
In the loader function above, my issue is not the api call but rather how I can get a token from the current App state to pass into the api call.
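One pattern that can work here (a sketch under the assumption that the token can live outside React, e.g. in memory or localStorage): loaders run outside the component tree, so instead of reading app state, keep the token in a small module that both the app and the loader import. getTagsAll comes from the post; the auth-store module and the token argument are illustrative.

```jsx
// auth-store.js (illustrative)
let token = null;
export const setToken = (t) => { token = t; };
export const getToken = () => token;

// NewPostLoader.js
import { redirect } from "react-router-dom";
import { getToken } from "./auth-store";
import { getTagsAll } from "./api"; // as in the post

export async function NewPostLoader() {
  const token = getToken();
  if (!token) return redirect("/");          // not authorised -> back to the homepage
  return { tags: await getTagsAll(token) };  // pass the token into the api call (if the helper accepts it)
}
```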
Any help is much appreciated.
submitted by ErroneousLogik to reactjs [link] [comments]


2022.09.23 19:19 witheredartery How to fix the issue of UseEffect not triggering on route change even after feeding in Params Object's property as dependency?

Here's my App.js component
App.js

function App() { return (    
} /> } /> } /> {/* ISSUE */} } />
); }
The route change to the User component happens in the Home component: I search for a GitHub profile, a list of profiles is displayed, and I click the "Visit Profile" button on a profile card, which links me to the User component.
UserItem.js:

import { Link } from 'react-router-dom' import PropTypes from 'prop-types' const UserItem = ({user: {login, avatar_url}}) => { return ( 
Profile

{login}

Visit Profile {/* TRIGGERING HERE!!!!!!!!!!! */}
) } UserItem.propTypes = { user: PropTypes.object.isRequired, } export default UserItem;
My User component, where the function getUser needs to be triggered. The function getUser works fine: when I used it outside a useEffect, it gave me the right response from the fetch call inside.
User.js

import React, { useEffect, useContext } from 'react' import {FaCodepen, FaStore, FaUserFriends, FaUsers} from 'react-icons/fa' import { Link } from 'react-router-dom' import GithubContext from '../context/github/GithubContext' import { useParams } from 'react-router-dom' import Spinner from '../components/layout/Spinner' import RepoList from '../components/repos/RepoList' export const User = () => { const params = useParams() const { getUser, user, loading, getUserRepos, repos } = useContext(GithubContext) console.log("params", params, "login", params.login) // WORKS FINE useEffect ( () => { console.log("use-effect", getUser, "", params.login) getUser(params.login) getUserRepos(params.login) },[params.login]) // DOESNT TRIGGER WITH [] or [params.login] console.log( "user-check",user) // user object empty const { name, type, avatar_url, location, bio, blog, twitter_username, login, html_url, followers, following, public_repos, public_gists, hireable, } = user console.log("YES THIS PAGE") // Here I was checking whether page is being mounted and YES it is being mounted const websiteUrl = blog?.startsWith('http') ? blog : 'https://' + blog if(loading){ return  } return (<> TYPICAL INFORMATION DISPLAY UI ) } 
submitted by witheredartery to reactjs [link] [comments]


2022.09.19 15:56 TheLastSamuraiOf2019 Cannot access VM - Newbie

I created a VM in the Google cloud. Loaded it with a docker image that contains some basic CRUD Rest API endpoints. The Go executable in the container listens at port 8080. The container worked well on my local machine. I was able to access all endpoints using https://localhost:8080/
I also created a firewall rule to open up all ports. However, I can't access the API endpoint. https://32.12.140.45:8080/api/getuser . I get a message that the page didn't respond. I started the container explicitly too using -p 8080:8080 but that also did not help. I do not see any incoming connections in the logs.
How should I go about troubleshooting? Any advice?
submitted by TheLastSamuraiOf2019 to googlecloud [link] [comments]


2022.09.19 13:31 p90fans CORS error: No 'Access-Control-Allow-Origin' header is present, while I already specified the origin in the server.

CORS error: No 'Access-Control-Allow-Origin' header is present, while I already specified the origin in the server.
Hi all, I have read the doc from MDN, the post from SO, and this sub, however after many hours of struggling I want to get some idea here.
From what I understand, in order for my React frontend (on Vercel) to use the API of my backend (on Heroku), the backend needs to send an "Access-Control-Allow-Origin" header to the frontend.
Using this knowledge with Express and the cors library, the frontend successfully uses the backend's database API; however, it fails on the API whose purpose is to return user info from the server. In Developer Tools, the failed request simply doesn't receive response headers that include Access-Control-Allow-Origin, and the console says the origin (frontend domain) has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
To illustrate, the screenshot below is from Developer Tools > Network; the response is a 503.
https://preview.redd.it/zwt79o46fto91.png?width=1959&format=png&auto=webp&s=9aa3ede4440e5b4696c01737d4f7bf9e96f99ea0
I suspect that it is because it involves some cookie credentials, and the MDN doc mentioned that credentials may affect CORS.
Below is the frontend code (my apology that Inline Code does not work properly):
const getUser = async () => {
  fetch(`${process.env.REACT_APP_BACKEND_URL}/auth/login/success`, {
    method: 'GET',
    credentials: 'include',
    headers: {
      Accept: 'application/json',
      'Content-Type': 'application/json',
      'Access-Control-Allow-Credentials': true,
      'Access-Control-Allow-Origin': 'https://xxx.vercel.app/'
    },
    mode: 'cors'
  })
Below is the backend code
const router = require('express').Router();
const cors = require('cors')

router.get(
  '/login/success',
  cors({
    origin: [process.env.FRONTEND_URL, 'http://localhost:3000'],
    methods: 'GET, POST, PUT,DELETE',
    credentials: true,
  }),
  async (req, res) => {
    console.log('REQ.USER:LOGIN/SUCCESS', req.user);
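For reference, one commonly suggested arrangement (a sketch, not a guaranteed fix; the routes file path is illustrative): register CORS once at the application level, before any routes, so every response produced inside Express, including preflight and error responses, carries the Access-Control-Allow-Origin header. It is also worth noting that a 503 often means the request never reached the Express handler at all (for example, the Heroku dyno failed), and responses generated outside Express won't have CORS headers.

```js
const express = require('express');
const cors = require('cors');

const app = express();

// CORS first, so it applies to every route and to preflight (OPTIONS) requests.
app.use(cors({
  origin: [process.env.FRONTEND_URL, 'http://localhost:3000'],
  methods: 'GET,POST,PUT,DELETE',
  credentials: true,
}));

app.use('/auth', require('./routes/auth')); // illustrative path to the router shown above
```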
Any idea? : (
Update: I wonder if I'm not stating the actual problem correctly, because in the Network tab there are two requests both named "success": one has no problem with CORS, and the other one is not working. See below:

https://preview.redd.it/29uqvbjshto91.png?width=2137&format=png&auto=webp&s=7e64c74921af5f3ca6fde1d729435e5038338977

https://preview.redd.it/d0w6vohthto91.png?width=1999&format=png&auto=webp&s=9dc388c6094a6cb9a77eaafdbb8082abb2a4026b
submitted by p90fans to webdev [link] [comments]


2022.09.09 22:47 warpanomaly PUT request works in Postman but not a deno typescript file as a fetch request

PUT request works in Postman but not a deno typescript file as a fetch request
I can do a put request just fine in Postman.
https://preview.redd.it/kiublse1bwm91.png?width=1728&format=png&auto=webp&s=e90a40c91c3a11f228e13ae3534a71d291422d4d

But when I try to do the same put request in a deno fresh app through a fetch command like this:
async function sendSignature() {
  const signer = provider.getSigner();
  const nonce = "asdf123492fd";
  const signatureSigned = await signer.signMessage(nonce);
  const headers = new Headers({ 'Content-Type': 'application/json' });
  const opts = {
    method: 'PUT',
    headers: headers,
    body: JSON.stringify({
      key: props.singleUser,
      wallet_address: walletAddrs,
      signature: signatureSigned
    })
  }
  console.log(opts.body);
  const rawPosts = await fetch('http://localhost:4000/users/kythis1', opts);
  console.log(rawPosts);
}
Btw all of the values are being populated in body. I can confirm that key, wallet_address, and signature are not null and are strings. It fails though... Here's what the browser's console looks like.
https://preview.redd.it/hqn0scg4bwm91.png?width=1382&format=png&auto=webp&s=87bf4e8b250591f476c4bb61106aa57d41ed2a08

This is the entry point for the backend oak (deno's version of express) server.
import { Application } from "https://deno.land/x/oak/mod.ts";
import { APP_HOST, APP_PORT } from "./config.js";
import router from "./routes.js";
import _404 from "./controllers/404.js";
import errorHandler from "./controllers/errorHandler.js";
import { oakCors } from "https://deno.land/x/cors/mod.ts";

const app = new Application();
app.use(oakCors()); // Enable CORS for All Routes
app.use(errorHandler);
app.use(router.routes());
app.use(router.allowedMethods());
app.use(_404);

console.log(`Listening on port:${APP_PORT}...`);
await app.listen(`${APP_HOST}:${APP_PORT}`);
This is the function that is getting called by the put request:
import User from "../db/database.js";

export default async ({ request, response }) => {
  if (!request.hasBody) {
    response.status = 400;
    response.body = { msg: "Invalid user data" };
    return;
  }
  const body = request.body();
  const { key, wallet_address, signature } = JSON.parse(await body.value);
  console.log(signature);
  if (!key) {
    response.status = 422;
    response.body = { msg: "Incorrect user data. Email and key are required" };
    return;
  }
  const foundUser = await User.where('key', '=', key).first();
  if (!foundUser) {
    response.status = 404;
    response.body = { msg: `User with key ${key} not found` };
    return;
  }
  foundUser.wallet_address = wallet_address;
  foundUser.updated_at = new Date();
  const updatedResp = await foundUser.update();
  response.body = { msg: "User updated", updatedResp };
};
Finally this is the backend routes:
import { Router } from "https://deno.land/x/oak/mod.ts";
import getUsers from "./controllers/getUsers.js";
import getUserDetails from "./controllers/getUserDetails.js";
import createUser from "./controllers/createUser.js";
import updateUser from "./controllers/updateUser.js";
//import deleteUser from "./controllers/deleteUser.js";

const router = new Router();

router
  .get("/users", getUsers)
  .get("/users/:key", getUserDetails)
  .post("/users", createUser)
  .put("/users/:key", updateUser);
  //.delete("/users/:id", deleteUser);

export default router;
So why can I successfully call this function with a Postman put request, but I can't do a successful put request with fetch through a typescript file?
submitted by warpanomaly to learnjavascript [link] [comments]


2022.06.10 18:23 afoshx is this a good typescript codebase style?

good evening everyone, I hope you are having a good day.
Lately I have been noticing a couple of things about JavaScript/TypeScript projects as a whole.
Most codebases that I see around GitHub, YouTube, and even on blogs use the (Router) approach; that's what I call it when I see the usual:
const getUsers = async (req: Request, res: Response) => {
  const {
    title,
    description,
    city,
  } = req.body;
  const { id } = req.params;

  const doc = await doc.findById(id);
  if (!program) {
    return handleResponse(res, useWord("resourceNotFound", req.lang), 404);
  }
  doc.isPublished = isPublished;
  await doc.save();
}
I see this pattern so often; is this really the right way, though?
Because I can clearly see all of this:
// lack of (unity), idk if that's good or not but I'll explain more below (#1)
const getUsers = async (req: Request, res: Response) => {
  const {
    title,
    description,
    city,
  } = req.body;              // router related
  const { id } = req.params; // router related

  // DB related
  const doc = await doc.findById(id); // db related
  if (!program) {
    return handleResponse(res, useWord("resourceNotFound", req.lang), 404);
  } // router related
  doc.isPublished = isPublished; // business logic
  await doc.save(); // db related
}
#1 - this means that in the future, the logic related to getUsers is bound to be repeated somewhere, because the business logic is now tied to the Express router. If I wanted the same feature anywhere else, I'd either have to copy this code and repeat it, or separate it into a function that's responsible only for the business logic.

Is this function even testable? It is currently responsible for three things:
router logic, business logic, and DB logic.
I have also noticed a couple of other things:
In most of these TypeScript projects, roughly 80% of the functions are functions that change something on an object.
I'm not necessarily saying that's a bad thing, but I have realized that coding that way causes you to treat objects as mere collections of properties rather than objects that have properties and methods, and that leads to a lot of repeated code. I'll explain below in code:
const post = {
  title: "this is a posttitle",
  body: "this is a post body",
  isApproved: false,
  approvedBy: null,
}

function approvePost(post, approvedBy) {
  const sanitary = checkIfBodyHasBadWords(post.body);
  if (!sanitary) return false;
  post.isApproved = true;
  post.approvedBy = approvedBy;
  return true;
}

function checkIfBodyHasBadWords(body) {
  // complex sanitary computation
}
Couldn't I have done it like this?
const post = {
  title: "this is a posttitle",
  body: "this is a post body",
  isApproved: false,
  approvedBy: null,
  approvePost(approvedBy) {
    const sanitary = checkIfBodyHasBadWords(this.body);
    if (!sanitary) return false;
    this.isApproved = true;
    this.approvedBy = approvedBy;
    return true;
  }
}

post.approvePost(admin);
I still don't really have a frame of reference for what is "bad" and what is "good" TypeScript code, so TypeScript gods, please have mercy on my ignorance; I'm trying to learn :)
Also, if you have time, please check out this code and let me know what you think of it. I saw it on a YouTube channel the other day and absolutely loved it.
It's heavily focused on the class-based approach:
https://github.com/afosh/typescript-class-approach
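Since the linked repo is described as class-focused, here is roughly what the same post object might look like as a class; purely a sketch, with checkIfBodyHasBadWords assumed to exist as in the snippets above:

```ts
class Post {
  isApproved = false;
  approvedBy: string | null = null;

  constructor(public title: string, public body: string) {}

  approve(approvedBy: string): boolean {
    const sanitary = checkIfBodyHasBadWords(this.body); // same helper as above
    if (!sanitary) return false;
    this.isApproved = true;
    this.approvedBy = approvedBy;
    return true;
  }
}

// const post = new Post("this is a post title", "this is a post body");
// post.approve(admin.name);
```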
thank you.
submitted by afoshx to typescript [link] [comments]


2022.02.27 05:54 bottledpee Hi! i'm new to laravel and I'm trying to setup a new project for our company. I have a few questions regarding routing, type-hints, and route control groups and others. Please help!

Here's what I learned so far regarding the different syntaxes for writing routes:
Syntax 1: *a simple get without passing any parameters.

Route::get('/users', 'App\Http\Controllers\UserController@getUsers');

Syntax 2: *a simple get, but with some kind of anonymous function

Route::get('/users', function () {
    // How do I specify that I'm trying to call the getUsers function here? Because this uses an anonymous function,
    // unlike syntax 1 and 4, the getUsers function is not specified in this syntax.
});

Syntax 3: *a get request with type-hint/dependency injection, as specified here: https://laravel.com/docs/9.x/routing#dependency-injection

Route::get('/users', function (GetUser $request) {
    // How do I call the getUsers function here?
    // Unlike syntax 1 and 4, the getUsers function is not specified in this syntax.
});

Syntax 4:

Route::get('/users', [UserController::class, 'getUsers']);
// How do I use a type hint here? Is it possible to specify here that
// the getUsers function requires a GetUser FormRequest, the same way as in syntax 3?

Syntax 5: *I prefer this one because our project will have a ton of controllers, but I don't know how to use type-hints/dependency injection here.

Route::controller(UserController::class)->group(function () {
    Route::get('/users', 'getUsers'); // how do I use type-hints here, same as in syntax 3?
});
As you can see I'm having trouble understanding the different syntaxes and how to write and use them properly.
What I'm trying to do is to write a route that:
  1. defines the function to be called, like syntaxes 4 and 5,
  2. but must use type-hints,
  3. and must also be inside a route controller group.
Please be patient with me I probably have the wrong mindset here or understanding of routes.
submitted by bottledpee to laravel [link] [comments]


2022.02.27 05:46 bottledpee Hi! i'm new to laravel and I have a few questions regarding routing, type-hints, and route control groups and others. Please help!

Here's what I learned so far regarding the different syntaxes for writing routes:
Syntax 1: *a simple get without passing any parameters.

Route::get('/users', 'App\Http\Controllers\UserController@getUsers');

Syntax 2: *a simple get, but with some kind of anonymous function

Route::get('/users', function () {
    // How do I specify that I'm trying to call the getUsers function here? Because this uses an anonymous function,
    // unlike syntax 1 and 4, the getUsers function is not specified in this syntax.
});

Syntax 3: *a get request with type-hint/dependency injection, as specified here: https://laravel.com/docs/9.x/routing#dependency-injection

Route::get('/users', function (GetUser $request) {
    // How do I call the getUsers function here?
    // Unlike syntax 1 and 4, the getUsers function is not specified in this syntax.
});

Syntax 4:

Route::get('/users', [UserController::class, 'getUsers']);
// How do I use a type hint here? Is it possible to specify here that
// the getUsers function requires a GetUser FormRequest, the same way as in syntax 3?

Syntax 5: *I prefer this one because our project will have a ton of controllers, but I don't know how to use type-hints/dependency injection here.

Route::controller(UserController::class)->group(function () {
    Route::get('/users', 'getUsers'); // how do I use type-hints here, same as in syntax 3?
});
As you can see I'm having trouble understanding the different syntaxes.
What I'm trying to do is to write a route that:
  1. defines the function to be called, like syntaxes 4 and 5,
  2. must use type-hints,
  3. and must be inside a route controller group.
Please be patient with me I probably have the wrong mindset here or understanding of routes.
submitted by bottledpee to PHPhelp [link] [comments]

