Nope. I don’t talk about myself like that.

  • 1 Post
  • 1.02K Comments
Joined 1 year ago
Cake day: June 8, 2023





  • Do you think a house gets sold every day,

    Please point me to where I said “every day”. Pretty sure I said “less than 1 a month”. Far from “every day”.


    or that everyone ends up buying?

    Uh…

    They probably had 50 other clients during that time as well. My point with that statement was originally that the “hours of work” you think they did for you… they weren’t working just for you. And the one sale that took you months… they likely had several other sales in that same period. Agents that I’ve worked with in the past were handling upwards of 20-30 showings in any given week. If only 1% of those yield a sale, then it’s about (gasp) ~1 sale a month! [52 × ~25 × 0.01 ≈ 13 sales a year]
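
    In code form (a minimal sketch; the showings-per-week and close-rate figures are the assumptions stated above):

    ```python
    # Assumed figures from the argument above: ~25 showings a week,
    # 1% of showings turning into a sale.
    showings_per_week = 25
    close_rate = 0.01
    weeks_per_year = 52

    sales_per_year = weeks_per_year * showings_per_week * close_rate
    print(sales_per_year)  # 13.0 -> roughly one sale a month
    ```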


    Or that they even get that $15k as take home?

    Never said take home either. But if I make 1 sale a month… and I get $15k out of that sale… and after everything is done, walk away with HALF… $7.5k × 12 = $90k… That’s STILL MORE THAN MOST PEOPLE. My point hasn’t changed in the slightest.
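
    Spelled out (same assumed figures as above):

    ```python
    commission_per_sale = 15_000  # ~3% of the average home price, per the thread
    take_home_fraction = 0.5      # "walk away with HALF", as assumed above
    sales_per_year = 12           # one sale a month

    print(commission_per_sale * take_home_fraction * sales_per_year)  # 90000.0
    ```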


    On average, most agents are making as little as $60k/yr and as high as $100k/year, region dependent.

    And? If you’re pulling those numbers from BLS or similar sources… you’re missing the fact that all of these realtors work under LLCs and that the $60-100k is take home. Which ignores that the LLC owns the car they drive, the cell they use, the miles they drive, etc… My own LLC does the same thing. I take home $80k, my company pulls in $160k, but a good chunk goes to operating costs and the rest sits in the company to grow it (or, if contracts dry up, to continue operating until I can obtain new ones). They’re doing the same shit. They have access to the added funds.


    Above that $100k mark is the top quartile, and above $130k-ish is going to be the top 10%.

    Duh? Those that make more than one sale a month! Almost like I already addressed this.


    It’s not as lucrative as you think it is.

    It’s exactly as lucrative as I think it is. And all your “arguments” to change my opinion fall flat at best. Show me realtors who only make 1% or less per home they sell (keep in mind that this is still “JUST” $4,951 per home on average) and I’ll shut the fuck up. Until then you’re wrong. And if you’re going to tell me that I’ve said something I didn’t say again… I’m just going to block you.

    And BTW…

    https://www.homelight.com/blog/how-many-homes-does-a-realtor-sell-a-year/

    According to NAR, Realtors completed a median of 12 residential “transaction sides” in 2022. Keep in mind that transaction sides are not a strict measure of homes sold. An agent earns a transaction side when they help either a buyer or a seller close a sale.

    So my “One sale a month” was dead on accurate.

    And it also turns out in that same link… the BLS data would be severely tainted by one simple fact…

    Part-timers and hobbyists sell fewer homes
    Compared to the high flying agents on reality TV shows, it might surprise you to learn that Realtors worked a median 30 hours per week in 2022, according to the NAR, and made a median gross income of $56,400. However, these figures also account for Realtors who don’t pursue real estate as a full-time job.


  • The average home price in the United States was $495,100 in the second quarter of 2023

    3% (the customary total is 6%, split evenly between the two realtors) of $495,100 is about $15k.
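
    As arithmetic (a sketch using the quoted Q2 2023 average):

    ```python
    average_price = 495_100  # Q2 2023 US average home price, per the quote above
    agent_cut = 0.03         # half of the customary 6% total commission

    print(average_price * agent_cut)  # 14853.0 -> about $15k
    ```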

    Why? Mine did a shit ton of work for me and fought the sellers to get them to do the shit the appraiser wanted done.

    So a few hours of emails and phone calls is somehow $15k worth of work?

    As well as working for almost a year looking at different houses before we finally found the right one

    They probably had 50 other clients during that time as well. Some of those pan out and become the $15k payday. Ultimately they only need to sell (or buy) like 10 houses a year to make a better living than you likely have. Less than 1 a month.





  • Saik0@lemmy.saik0.com to Lemmy Shitpost@lemmy.world · Ironing · 12 days ago

    Nothing about their comment outlined that they didn’t know how to use it, only that they never did use it.

    Nothing about their comment alludes to any fact that they’re embarrassed at all. They posted it publicly and directly with no reservation, which is the opposite of “embarrassed”.

    They didn’t blame anyone for anything related to the iron itself, but rather accused people of shallow intentions if they care at all about the clothes that they wear. Which I can understand and agree with to some extent.

    You on the other hand… you’re a jackass. Lots of insinuations, lots of assumptions. Just to put down some random person on the internet for not wanting a fucking iron that was probably the $20 special at Wally World.


  • Nope. There is an industry standard way of measuring latency, and it’s measured at the halfway point of drawing the image.

    No. And if you want to actually provide a link to your “industry standard”, feel free to; just make sure that your “standard” can actually be applied to a CRT first.

    You can literally focus the CRT to only show one pixel (more accurately, one beam width) worth of value. And that pixel would be updated many thousands of times a second (effectively continuously, since it’s analog).

    If you’re going to define latency as “drawing the image” (by any part of the metric), then a CRT can draw a single “pixel” worth of value thousands of times a second… probably more. Whereas your standard 60 Hz panel can only do 1/60th of a second (or 1/365 for even the fastest LCDs).

    If there is a frame to draw and that frame is being processed, then yes, you’re right: measuring at the middle will yield a delay. But this isn’t how all games and devices have worked throughout history. There are many applications where data being sent to the display is literally read from memory nanoseconds prior. CRTs have NO processing delay, while LCDs do.

    Further problems with your post: CRTs are not all “NTSC” standard (virtually every computer monitor, for instance). There are plenty of CRTs that can push much higher than the NTSC standard specifies.

    Here’s an example from a bog standard monitor I had a long time ago… https://www.manualslib.com/products/Sony-Trinitron-Cpd-E430-364427.html

    800 x 600/155 Hz
    1024 x 768/121 Hz
    1280 x 1024/91 Hz
    1600 x 1200/78 Hz

    So a 60 Hz LCD will always take ~0.0167 s (1/60th of a second) to draw the whole image, regardless of the resolution being displayed. Not so on the CRT… higher-performance CRTs can draw more “pixels” per second, and when you lower the number of lines you want it to display, the full-frame draw times go down substantially. There are a lot of ways to define these things that your simplistic view doesn’t account for. The reality, though, is that if you skip the idea of a “frame”, the time from input to display on the CRT monitor can be lower simply because there’s no processing occurring; your limit is the physics of the materials the monitor is built from, not some chip’s capability to decode a frame. Thus… no latency.
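
    To put numbers on that (a minimal sketch; the CRT modes are the ones listed from the Trinitron manual above):

    ```python
    # Full-frame draw time is simply 1 / refresh rate.
    # CRT modes are from the Sony Trinitron CPD-E430 manual linked above.
    modes = {
        "CRT 800x600": 155,
        "CRT 1024x768": 121,
        "CRT 1280x1024": 91,
        "CRT 1600x1200": 78,
        "LCD, any resolution": 60,
    }

    for mode, hz in modes.items():
        print(f"{mode} @ {hz} Hz: {1000 / hz:.2f} ms per full frame")
    ```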

    Not frametime. Not FPS. Not Hz. Latency is NONE of those things, otherwise we wouldn’t have those other terms and would have strictly used “latency” instead.

    And a wonderful example of this is the Commodore 64 tape loading screens. https://youtube.com/watch?v=Swd2qFZz98U

    Those lines/colors are drawn straight from memory without the concept of a frame. There is no latency here. Many scene demos abused this behavior to achieve really wild effects as well. Your LCD cannot do that; those demos don’t function correctly on LCDs…

    Lightguns are a perfect example of how this can be leveraged (which is completely impossible on an LCD as well).

    Specifically scroll down to the Sega section. https://www.retrorgb.com/yes-you-can-use-lightguns-on-lcds-sometimes.html

    By timing the click of the lightgun against which pixel is currently being drawn, the console can take that as the gun’s input. That requires minimal latency. LCDs can’t do that.

    Ultimately, people like you are trying to redefine latency in a way that flies in the face of actual history, which shows us there is a distinct difference that has historically mattered, and even applications of that latency that CANNOT work the way you’re claiming.

    https://yt.saik0.com/watch?v=llGzvCaw62Y#player-container

    Can you tell me why the LCD on the right is ALWAYS behind? And why it will ALWAYS be the case that it will not work, regardless of how fast the LCD panel is? The reason you’re going to arrive at is processing delay, which didn’t exist on CRTs. That’s “LATENCY”.

    When talking about retro consoles, we’re limited by the hardware feeding the display, and the frame can’t start drawing until the console has transmitted everything.

    This is where you’re completely wrong. A CRT doesn’t know the concept of a frame. It draws the input it gets. Period. There’s no buffer… there’s nowhere to hold onto anything that is being transmitted. It’s literally just spewing electrons at the phosphors.

    Edit: typo

    Edit2: to expound on the LCD vs CRT thing with light guns. CRTs draw the “frame” as it’s received… as the voltage comes in, it varies the voltage on the electron gun itself, which means that when the Sega console (in this case) sets the video buffer to the white value for a coordinate and displays it, it knows exactly which pixel is currently being modified. The LCD will take the input, store it in a buffer until it has the full frame, then display it. The Sega doesn’t know when that frame will actually be displayed, as there’s other shit between it and the display mechanism doing stuff. There is an innate delay that MUST occur on the LCD that simply doesn’t on the CRT. That’s the latency.
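
    A rough sketch of the beam-chasing idea (the line count, field rate, and pixel width below are illustrative NTSC-ish numbers, not any particular console’s):

    ```python
    # Illustrative model: given time since vertical sync, work out which
    # scanline/pixel the CRT beam is drawing. A console can flash a target
    # white and check whether the lightgun's photodiode fires at that moment.
    LINES_PER_FRAME = 262      # NTSC-ish field, illustrative
    FRAME_TIME_S = 1 / 59.94   # NTSC field rate
    LINE_TIME_S = FRAME_TIME_S / LINES_PER_FRAME
    PIXELS_PER_LINE = 320      # illustrative active width

    def beam_position(t_since_vsync: float) -> tuple[int, int]:
        """Return the (line, pixel) the beam is drawing at a given time."""
        line = int(t_since_vsync / LINE_TIME_S) % LINES_PER_FRAME
        pixel = int((t_since_vsync % LINE_TIME_S) / LINE_TIME_S * PIXELS_PER_LINE)
        return line, pixel

    # If the gun fires at time t, the console knows which pixel was under the
    # beam -- but only because the CRT draws each pixel the instant the signal
    # arrives. An LCD's buffer-then-display pipeline breaks this timing.
    print(beam_position(0.008))  # roughly mid-frame
    ```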



  • Absurdly safe.

    Proxmox cluster, HA active. Ceph for live data. TrueNAS for long-term/slow data.

    About 600 pounds of batteries at the bottom of the rack to weather short power outages (up to 5 hours). 2 dedicated breakers on different phases of power.

    Dual/stacked switches with LACP’d connections that must span both switches (one switch dies? Who cares). Dual firewalls with a CARP ACTIVE/ACTIVE connection…

    Basically everything is as redundant as it can be, aside from one power source into the house… and one internet connection into the house. My “single points of failure” are all outside of my hands… and are all mitigated/risk-assessed down.

    I do not use cloud anything… to put even 1/10th of my shit in the cloud would cost thousands a month.






  • you would realise security even without the cloud is critical to protecting systems

    Wazuh, the software I specifically called out, is not “cloud”. They offer a cloud service, yes (that’s how they make money: on lazy admins, or orgs that are too small to house their own infra). But it is self-hosted and designed to be run within the network.

    You clearly have no idea what the current security market looks like. Nor what half of the terms you use actually mean.

    Edit: Forgot to address this too

    Virtualising every single system endpoint is practically impossible, which Wazuh seems to rely on.

    No. The agent can be installed on ANY system. They recommend you install the orchestration/control node virtualized, which you don’t have to do. You can install it on a raw system, though that would be a huge waste of resources. You seem to have missed that.


  • It is clear what you engaged in was attempting to malign all Lemmy.ml and lemmygrad.ml users

    By pointing out the correct answer to a person’s question?

    Are you okay? You realize that my answer was basically the same as the answer given by the lemmy.ml user in a different part of the thread, just not an essay’s worth of content when a sentence is sufficient.

    You are a piece of shit. If Kiwifarms goes after people like you

    So a call to action to dox people? Why are you threatening people and calling them names? Aren’t you a mod? I mean you might have a case or argument if the votes weren’t kept on the platform itself.


  • The latter is beyond lacking in open source ecosystem

    And yet software like Wazuh (https://github.com/wazuh) exists… which is a complete SIEM and XDR platform, and does more than any antivirus could ever dream of doing. But somehow OSS security is lacking? Sounds like you haven’t looked at the security field seriously in decades. Kaspersky doesn’t lead the pack in anything, and it isn’t a “level field”. Quite the contrary: antivirus as a concept has been commodified in IT. They’re all generally drop-in replacements for each other, and they are not what is actually used to prove to security auditors that systems are secure. You may get 1% detection differences between platforms, or maybe an update 30 minutes or an hour earlier. This is generally meaningless, and the modern tools actually used to prove security go way deeper than an antivirus.

    Lying to yourself is never going to solve problems.

    Seems to work for you though?