Microsoft - Antitrust Measures: Sherman and Robinson Acts

Highlighting (Member, 153 posts)
02-07-2016, 01:05 AM #1
Hello everyone, I'm sharing this because I'm unsure where else it will get the right attention. I'll keep it brief.

On Jan 15th, 2019, my network was hacked and two systems I used for virtualization with the Microsoft Deployment Toolkit were compromised. I had to reset both machines and start fresh. Ten days later, on Jan 25th, 2019, I uploaded a video at 8:30 AM. The next day, project engineer Michael Niehaus posted an update about MDT version 8456. Interestingly, he mentioned that Johan Arwidmark had replied about a ZTIGather.xml file missing from Niehaus's work. It seemed suspicious, since they were trying to fix the script but hadn't shared it yet.

I eventually got frustrated and told them I wanted to rewrite their guide, because their process was slow and used unnecessary tools. After some time, I realized it maybe wasn't my place to do their job. Eventually, on Sunday, I posted on GitHub that the delay felt deliberate. I suspected the module they relied on contained malicious code in every Windows system (DVDs, USBs, servers, or SCCM), and that they might be trying to hide something. Five minutes later, the project was closed.

What I did was back up the repo and try the process again. Eventually, I discovered that the BDD.Core module used by SCCM and MDT actually deployed a Win32 virus: the Customer Experience Improvement Program. This malware appeared in every Microsoft service version. After rebooting my router, I found a CVE report linking it to time.windows.com and my original network box. Why? Because they were the first to face this issue. They seemed to be trying to shift blame while violating privacy and security. Once I restarted, I uncovered evidence pointing to their involvement.

This experience made me question their intentions and the trust we place in them.

bunnywithabowl (63 posts)
02-07-2016, 07:59 AM #2
Microsoft and Windows have major issues. This article will reach Barnacules.

Necron65 (Member, 205 posts)
02-13-2016, 12:27 PM #3
I focused on turning a target OS into instruction formats, essentially converting raw data into vectorized instructions. The goal was a perfect 1:1 reconstruction of the transmitted information, which could shrink data usage by up to 100,000 times, since instructions are much smaller than the data they reproduce. If the right tools exist, I believe we can achieve better results than relying on services like AWS, where all input is copied and stored for analysis. This approach could also help generate more revenue through MS-related services.
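The post doesn't share an implementation, but the core claim (that a short "instruction" can stand in for a much larger block of raw data and still reconstruct it 1:1) can be illustrated with a deliberately simple toy. The run-length scheme below is my own stand-in, not anything the poster describes; it just shows why instructions can be orders of magnitude smaller than the bytes they regenerate:

```python
# Toy sketch: represent raw bytes as compact "instructions"
# (run-length pairs) that rebuild the original data exactly.
from itertools import groupby

def encode(data: bytes) -> list[tuple[int, int]]:
    """Turn raw bytes into (byte_value, run_length) instructions."""
    return [(b, len(list(run))) for b, run in groupby(data)]

def decode(instructions: list[tuple[int, int]]) -> bytes:
    """Replay the instructions to reconstruct the original bytes."""
    return b"".join(bytes([b]) * n for b, n in instructions)

raw = b"\x00" * 100_000        # 100,000 bytes of raw data
ops = encode(raw)              # collapses to one (0, 100000) instruction
assert decode(ops) == raw      # perfect 1:1 reconstruction
```

Real data is rarely this repetitive, so the 100,000x figure only holds for pathological inputs like the one above; for general data the achievable ratio depends entirely on how much structure the instruction format can capture.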

FamusLuna (Member, 202 posts)
02-13-2016, 06:48 PM #4
It reflects the same concept Google explored with Chromebook technology, though it came a bit too early and was confined to the browser and OS settings. That's why I've been focusing on a deployment solution that avoids static image copying and fixed image templates for editing. I aim to create a hybrid approach that sits between a live file system and a traditional operating system, using hash tables instead of regular paths and symlinks. This way, I could match the performance of services like AWS while using significantly less bandwidth.
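The "hash tables instead of regular paths and symlinks" idea resembles content-addressable storage as used by systems like Git or OSTree. The poster gives no implementation, so the sketch below is a hypothetical minimal version of that general idea: blobs are keyed by the SHA-256 hash of their content, so identical content is stored (and would be transferred) only once no matter how many logical paths point at it:

```python
import hashlib

class ContentStore:
    """Minimal content-addressed store: blobs keyed by SHA-256 of
    their content rather than by filesystem paths or symlinks."""

    def __init__(self) -> None:
        self.blobs: dict[str, bytes] = {}  # content hash -> content
        self.index: dict[str, str] = {}    # logical path -> content hash

    def put(self, path: str, content: bytes) -> str:
        digest = hashlib.sha256(content).hexdigest()
        self.blobs[digest] = content       # dedup: same content, one blob
        self.index[path] = digest
        return digest

    def get(self, path: str) -> bytes:
        return self.blobs[self.index[path]]

store = ContentStore()
h1 = store.put("/bin/tool", b"same payload")
h2 = store.put("/backup/tool", b"same payload")
assert h1 == h2                # identical content -> identical hash
assert len(store.blobs) == 1   # stored (and transferable) only once
```

The bandwidth saving in such a design comes from the deduplication: a deployment only needs to ship blobs whose hashes the target doesn't already have, rather than copying a full static image.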