How to Edit the Directions For Using And Submitting The Electronic Self-Study and Add a Signature Online
Start editing, signing and sharing your Directions For Using And Submitting The Electronic Self-Study online by following these easy steps:
- Click the Get Form or Get Form Now button on the current page to jump to the PDF editor.
- Wait a moment for the Directions For Using And Submitting The Electronic Self-Study to load.
- Use the tools in the top toolbar to edit the file, and the added content will be saved automatically
- Download your completed file.
The best-rated Tool to Edit and Sign the Directions For Using And Submitting The Electronic Self-Study


Start editing a Directions For Using And Submitting The Electronic Self-Study right now
A quick tutorial on editing Directions For Using And Submitting The Electronic Self-Study Online
Editing your PDF files online has become quite easy, and CocoDoc is a free tool you can use to make changes to your file and save it. Follow our simple tutorial to start!
- Click the Get Form or Get Form Now button on the current page to start modifying your PDF
- Add, change or delete your text using the editing tools on the top toolbar.
- After editing your content, add the date and a signature to complete it.
- Review your form again before you click the button to download it.
How to add a signature on your Directions For Using And Submitting The Electronic Self-Study
Though most people are accustomed to signing paper documents by hand, electronic signatures are becoming more common. Follow these steps to add a signature for free!
- Click the Get Form or Get Form Now button to begin editing on Directions For Using And Submitting The Electronic Self-Study in CocoDoc PDF editor.
- Click on the Sign tool in the tool box on the top
- A window will pop up, click Add new signature button and you'll have three ways—Type, Draw, and Upload. Once you're done, click the Save button.
- Drag, resize and position the signature inside your PDF file.
How to add a textbox on your Directions For Using And Submitting The Electronic Self-Study
If you need to add a text box to your PDF to include extra content, follow these steps to accomplish it.
- Open the PDF file in CocoDoc PDF editor.
- Click Text Box on the top toolbar and move your mouse to position it wherever you want to put it.
- Write in the text you need to insert. After you’ve inserted the text, you can use the text editing tools to resize, color or bold the text.
- When you're done, click OK to save it. If you’re not happy with the text, click on the trash can icon to delete it and start afresh.
A quick guide to Edit Your Directions For Using And Submitting The Electronic Self-Study on G Suite
If you are looking for a solution for PDF editing on G Suite, CocoDoc PDF editor is a recommended tool that can be used directly from Google Drive to create or edit files.
- Find CocoDoc PDF editor and install the add-on for Google Drive.
- Right-click on a PDF document in your Google Drive and select Open With.
- Select CocoDoc PDF from the pop-up list to open your file, and allow CocoDoc to access your Google account.
- Modify PDF documents by adding text and images, editing existing text, and highlighting; polish the text in CocoDoc PDF editor before saving and downloading it.
PDF Editor FAQ
How do I participate in Kaun Banega Crorepati?
I would like to begin with a little disclaimer. This is a detailed post about the process I went through to make it to the hot seat. The idea is to cover as many details as possible, to clear most of the genuine queries I have received about the process.

This answer covers the broad steps I went through till I reached the hot seat, as objectively as possible. Also, please keep in mind that in 2020 the show was shot under extraordinary conditions, so a lot must be different from their usual process.

Is KBC scripted?

As I mentioned in more detail in a previous answer, the only scripted bits I directly observed during the show were the handing over of the 3,20,000 cheque, the full-amount online transfer right after a contestant finishes the game, and the choice of the Flip The Question category.

Unlike the popular conspiracy theory, KBC candidate selection isn't fixed or scripted. The contestants at any stage are not aware of the GK questions that will be asked of them. I was never asked to fake a story about myself. Additionally, while I was encouraged to be candid, the team respected my privacy.

The First Step: Registration

The registrations for KBC 2020 began on 9 May 2020 and went on till 22 May. A simple GK question was asked each day, which could be answered via SMS or the Sony LIV app. I used the app to register myself. I vaguely remember sharing a few personal details, including my name, age, gender, state, phone number and email, while registering on Day 1, before submitting the answer to the question of the day.

That's all. You register with your details, answer the question of the day while the registration is open, and keep your fingers crossed for the randomiser to select you for the next step.
This is probably the easiest step in the whole process in terms of merit, but the hardest in terms of sheer luck.

Step Two: Electronic Telephonic Assessment

After the registration process closed on 22 May, the selected candidates started getting calls for a telephonic assessment. It is an electronic message that congratulates you for being selected for the next round and quickly throws three questions at you, which have to be answered within a few seconds. Basically, no time to sneak a peek. You either know it or you don't.

I received the magical call on 27 May 2020. I was asked a question each about art and culture, the economy, and KBC's favourite topic: Nobel Prizes. Options were not given for the third question. I personally felt that the level increased with each successive question. I wasn't sure whether I had answered the last question accurately; it was a calculated guess made in under 10 seconds! So I didn't expect it to go any further. However, I was soon proven wrong.

Step Three: The App-Based Online Audition

On 1 June 2020, I received a call from the KBC team. The person congratulated me for making it to the next round and asked me to be ready for an app-based audition on 6 June 2020. I was also asked to mail certain proof documents.

The audition consisted of two parts. First, a quiz comprising 20 GK questions took place for all the selected contestants simultaneously. The quiz was in MCQ format and the questions were extremely general: current affairs, history, geography, science, Bollywood, etc. Basically everything under the sun. Each question had to be answered within 20 seconds, once again extremely tightly timed.

Once the GK test was over, 6 pre-recorded video messages by Amitabh Sir were shared through the app. Each video had a personal interview/slam-book type question and had to be answered by making a video.
Contestants are requested to wear dark-coloured clothes and shoot the videos in a well-lit room with no background noise (including the fan's noise). I was given about a day to record and upload the videos. Believe me when I say this: it isn't easy. This was hands down the toughest step for me. The questions, while simple and straightforward, made me think a lot. I must thank the KBC team for this self-awareness exercise they made me go through, especially during the lockdown.

Step Four: Personal Interview via Virtual Meeting

After about a month, in the second week of July, I received a call from the KBC team congratulating me for making it to the next round. I was then asked to be ready for a virtual meeting scheduled to take place after 4 days. The same instructions about clothing and the room being well lit and soundproof were repeated. I was also asked to fill in (handwritten), scan and submit a 14-page questionnaire within a day.

The interview took place on the designated day, a little earlier than the designated time, via a virtual meeting app. The interviewer was quite humble and tried to make me comfortable with the process. The interview went on for about 50 minutes, during which we mostly chatted about the videos and the questionnaire submitted earlier. Towards the end, the interviewer gave a surprise GK test of 20 questions, once again in MCQ format, to be answered in under 20 seconds each. And that was all!

Step Five: The Long Wait

After the personal interview, I received no further intimation from the KBC team till late October. In fact, the show had started airing and I was convinced that I hadn't made it. Thus, I completely forgot about it.

Step Six: The Sudden GK Test

In the last week of October 2020, I received a call from the KBC team asking me to be ready for a GK test scheduled the next day. The next evening, at around 9:30 pm, I faced another senior member of the production team on a virtual meeting.
Once again, I faced 20 GK questions in MCQ format, to be answered within a few seconds each. This was by far my most average performance in their GK tests. However, I personally felt that the question set was the toughest till that point. So I started considering the possibility of making it through.

Step Seven: The Final Call

Two days later, I received a call from the KBC team congratulating me for making it to the Fastest Finger First (FFF) round. I was invited to Mumbai along with a companion. Details about the basic logistics (stay, travel, quarantine, etc.) were exchanged. As many participants have said in the past, the stay and flight expenses of the contestant and one companion are borne by the production. All arrangements, even the web check-in, are managed by them, so it is quite seamless in that sense.

Over the next week, till the time my father and I flew to Mumbai, we went through multiple calls with various KBC production teams. This also included shooting lots and lots of bits on my phone for the short video that they show during the show. The reality shoot team guided us quite patiently through this process. As mentioned earlier, since the show was being shot during these extraordinary times, I doubt this is their usual modus operandi.

We also got a call regarding the clothing requirements. Each contestant and their companion is asked to arrange about 10-15 pairs of clothes. Certain colours, textures, stripes, etc. are not allowed, as they can cause jitter issues with the camera. Clothes with brand logos are also not allowed.

Step Eight: Mumbai

On reaching Mumbai, we were checked into a comfortable hotel.
All individuals were isolated till the COVID-19 RT-PCR tests were conducted. Since my father tested positive, he was immediately moved to the quarantine facility, where he was duly taken care of under a doctor's supervision till he tested negative. Despite testing negative, the rest of us contestants and companions were strictly asked not to leave our hotel rooms, to avoid exposure. Thus, unlike what I had read online from previous contestants, the contestants did not get to interact a lot this season.

The only time we left our rooms was a day before shooting began, for a final GK test that took place in the hotel common area. Proper physical distancing measures were maintained during the 20-minute test. Once again, the MCQ test comprised about 20 questions.

Step Nine: D-Day

Early in the morning we were taken to the shooting location with all our luggage. Extremely high standards of hygiene and sanitization are maintained in the building. Masks are mandatory. After a sanitization spray, body temperature and oxygen levels are checked. Then one is asked to sanitize their hands. Only after that can one enter the building.

Then we were taken to the dressing area. All our clothes are collected for steam ironing and selection. The team decides what will be worn by each contestant. The hair and makeup are also decided by the KBC team. The production team is extremely experienced, always in countdown-timer mode, and yet each person is extremely polite and helpful. Food, tea and even kaadha are provided to all the team members as well as the contestants.

Once ready, all the contestants are taken to the set for practice. No electronic gadgets or watches are allowed on the set. We had to keep wearing our masks till the shoot began; when the shoot started, we were asked to keep the masks behind us on the seat, or in our pockets if the attire had any. To make everyone comfortable with the FFF touchscreen, 4-5 mocks are done.
Even a mock for sitting on the hot seat is done, to make everyone comfortable with the set. And that is all; after that the game begins. It is shot in one flow, with cuts only when Amitabh Sir announces a break, and even those cuts are very short. The only big cut happens after each FFF round, once the contestant for the hot seat is selected.

When I cleared the FFF and reached the hot seat, we got a 10-15 minute break during which they did my touch-up, wished me luck and asked me to stay calm. After that, I played the grandest quiz of my life, quit at the 13th question, got the quintessential photo clicked with Amitabh Sir at the hot seat, and left the stage for the next round of FFF to begin!

I hope this answer was helpful to those who have been curious about the show's process. One of the most valuable things I learnt after participating in KBC this year was that most of the contestants who make it to the FFF stage are talented. Some might appear below average because of the few questions they faced in front of the camera, but believe me, they handled and cleared a rigorous process before reaching the KBC set! Big salute to all KBC FFF contestants! 🙏
What is the new Apple U1 chip, and why is it important?
The biggest Apple announcement today was what Apple actually didn't announce (yet).

"Hey Siri, we lost Spot the dog. Do you know where he is?"

Siri: "Spot is located 87 feet forward and 2 feet down from the height of the iPhone. Please hold up your phone and follow the balloon to Spot's location."

Today, September 10th, 2019, Apple announced the iPhone 11 and iPhone 11 Pro series of phones. Not mentioned on stage, but briefly shown on the screen during Phil Schiller's presentation, was the new Apple U1 chip. Hidden in plain sight, much like how he pre-announced Apple Pay, the Apple U1 chip was there for the world to see, yet most missed it until after the event.

Specimen: Apple Event, September 10th, 2019, showing the Apple U1 chip.

In 2012 Phil did precisely the same thing during the announcement of TouchID, when a credit card machine, a Hypercom device [0], was presented on the screen as a potential use case for TouchID. I wrote that this was nearly 100% confirmation of what became Apple Pay (I called it the iWallet; I know, very 2012 of me). Many folks in the payment industry, including disruptive startups, thought me insane and went about becoming redundant when Apple Pay was released. Of course, I had far more basis than a single Phil image. History is about to repeat itself.

Specimen: Phil showing TouchID use cases in 2012.

Why was the Apple U1 chip on a graphic behind Phil, not announced overtly, and only lightly mentioned on the Apple website? We will explore this in more detail at the end. The first mention of how Apple will use the U1 chip was presented on Apple's own website as a new, highly directional version of AirDrop. The text on Apple's website for the iPhone 11 series says it all:

Ultra Wideband technology comes to iPhone.

The new Apple‑designed U1 chip uses Ultra Wideband technology for spatial awareness — allowing iPhone 11 to precisely locate other U1‑equipped Apple devices.
Think GPS at the scale of your living room. So if you want to share a file with someone using AirDrop, just point your iPhone at theirs and they'll be first on the list.

And:

Can you be more precise? Yes. The new Apple‑designed U1 chip uses Ultra Wideband technology for spatial awareness — allowing iPhone 11 Pro to precisely locate other U1‑equipped Apple devices. It's like adding another sense to iPhone, and it's going to lead to amazing new capabilities. With U1 and iOS 13, you can point your iPhone toward someone else's, and AirDrop will prioritize that device so you can share files faster. And that's just the beginning.

Just the beginning, indeed.

Specimen: Apple website, September 10th, 2019, showing Apple U1 chip promotion.

Meet Ultra-Wideband Radio Technology

The "U" in the U1 chip refers to the Ultra-Wideband (UWB) radio technology [1] it uses. UWB can be used for many applications and use cases. One use case that will become very large for Apple, as they move to AR/MR technology and Apple Glasses, is tracking the spatial relationships of objects. One way to do this is with lasers and IR systems, and Apple is already doing this to some degree with FaceID and Animoji. The other way is via the radio spectrum.

The Apple U1 chip most assuredly uses a variant of the IEEE 802.15 WPAN standard from the IEEE 802.15.4z Enhanced Impulse Radio group, of which Apple is an active member. IEEE 802.15.4z, to put it in simple terms, wants to absorb, in some ways, and extend Bluetooth, NFC, WiFi and other network standards and protocols.

An early concept of this technology was used in an all-but-abandoned Apple initiative called iBeacons [2]. That technology was centered around Bluetooth and Bluetooth Low Energy (BLE).
The idea was sound, but the technology was low resolution: so low that it was hard to get within a few feet without triangulation of 3 or more iBeacons, and even then it could drift significantly with heat and obstacle issues.

Parallel to the iBeacon research, Apple was testing a newer and more exacting technology in their research labs in 2005. By 2006, before the iPhone was even announced, they applied for a patent for "Ultra-wideband radios for time-of-flight-ranging and network position estimation" via a research grant at Livermore Labs. It took until September 2010 for the patent application to be released by the US Patent and Trademark Office. Apple went on to do a lot of work with iBeacons and BLE. However, the three-dimensional spatial resolution was not nearly as accurate as Apple needed, and thus they abandoned the iBeacon concept.

The "Smoking Gun" Apple UWB Patents

More recently, Apple has applied for a few more patents centered around UWB. Inventor Joachim S. Hammerschmidt has developed some amazing extensions of this technology. Apple inventor Benjamin Vigier has also contributed greatly to the UWB beacon concept. Joachim is a bit of an Apple patent savant, having submitted a few dozen patents on UWB and other radio frequency technologies.

Specimen: Joachim S. Hammerschmidt Apple patents.

Even though the concepts behind UWB have been around since the dawn of radio, and were first developed in a usable way by RCA in the 1950s, miniaturization and low-power chips have given the technology this new form. I have surfaced four very interesting Apple patents centered around UWB that most have overlooked:

Beacon Triggered Processes, 2019 (United States Patent Application 0190272567)
Ultra-wideband radios for time-of-flight-ranging and network position estimation, 2006 (United States Patent Application 0100225541)
TIME INSTANT REFERENCE FOR ULTRA WIDEBAND SYSTEMS, 2018 (United States Patent Application 0190199398)
Pulse Shaping Interoperability Protocol for Ultra Wideband Systems, 2017 (United States Patent Application 0190007093)

Specimen: Joachim S. Hammerschmidt Apple patent embodiment.

Clearly Apple had predicted the need for UWB as far back as the early 2000s. There are more Apple patents that relate to this technology, but these give an interesting overview:

Beacon Triggered Processes

Abstract

Techniques and systems for beacon triggered processes are disclosed.
A described technique includes causing a beacon device to broadcast a beacon message, the beacon device being within the vicinity of an establishment; detecting a presence of a user of a mobile device based on receiving from the mobile device a first message that is responsive to the beacon message; retrieving a transaction record based on a user or mobile device identifier in the first message; generating and transmitting a second message based on the transaction record to facilitate a completion of a transaction associated with the transaction record at the establishment; the second message being configured to provide notification of an arrival of the user and dispatch an employee to meet the user and handle the transaction; and generating and transmitting a third message based on the transaction record to facilitate the completion of the transaction at the mobile

Specimen: Benjamin Vigier Apple patent embodiment.

And:

Ultra-wideband radios for time-of-flight-ranging and network position estimation

Abstract

This invention provides a novel high-accuracy indoor ranging device that uses ultra-wideband (UWB) RF pulsing with low-power and low-cost electronics. A unique aspect of the present invention is that it exploits multiple measurements in time and space for very accurate ranging. The wideband radio signals utilized herein are particularly suited to ranging in harsh RF environments because they allow signal reconstruction in spite of multipath propagation distortion. Furthermore, the ranging and positioning techniques discussed herein directly address many of the known technical challenges encountered in UWB localization regarding synchronization and sampling.
In the method developed, noisy, corrupted signals can be recovered by repeating range measurements across a channel, and the distance measurements are combined from many locations surrounding the target in a way that minimizes the range biases associated with indirect flight paths and through-wall propagation delays.

And:

TIME INSTANT REFERENCE FOR ULTRA WIDEBAND SYSTEMS

Abstract

Embodiments enable communicating Ultra Wideband (UWB) devices to collaborate by exchanging pulse shape information. The UWB devices use the pulse shape information to improve ranging accuracy. The improved ranging accuracy can be used in complex multipath environments where advanced estimation schemes are used to extract an arriving path for time-of-flight estimation. To determine the pulse shape information to be shared, some embodiments include determining location information of a UWB device and selecting the pulse shape information that satisfies regional aspects. The pulse shape information includes a time-zero index specific to a ranging signal that is used by UWB receivers to establish timestamps for time-of-flight calculations. Some embodiments include measuring performance characteristics and selecting different pulse shape information based on the performance characteristics for improved accuracy.

And:

Pulse Shaping Interoperability Protocol for Ultra Wideband Systems

Abstract

Embodiments enable communicating Ultra Wideband (UWB) devices to collaborate by exchanging pulse shape information. The UWB devices use the pulse shape information to improve ranging accuracy. The improved ranging accuracy can be used in complex multipath environments where advanced estimation schemes are used to extract an arriving path for time-of-flight estimation. To determine the pulse shape information to be shared, some embodiments include determining location information of a UWB device and selecting the pulse shape information that satisfies regional aspects.
The pulse shape information includes a time-zero index specific to a ranging signal that is used by UWB receivers to establish timestamps for time-of-flight calculations. Some embodiments include measuring performance characteristics and selecting different pulse shape information based on the performance characteristics for improved accuracy.

In the "Pulse Shaping Interoperability Protocol for Ultra Wideband Systems" Apple patent we find very enlightening embodiments:

[0021] Precise knowledge of pulse shape information used at a station's transmitter allows the use of receivers that isolate pulse shaping or other filtering effects from true propagation channel effects. Knowledge of the pulse shape information also allows the use of signal processing techniques that may be referred to as "deconvolution" techniques--methods to look at an overall received signal (e.g., end-to-end impulse response from transmitter to receiver) and factor out known artifacts such as, for example, transmitter pulse shaping including antenna effects or receiver transfer characteristics. These signal processing techniques allow extraction of a desired contribution of a wireless propagation channel in the overall system response; in turn, this extraction can be used to determine a time instant of an arriving propagation path.

Other embodiments present an example system that may include, but is not limited to, UWB devices such as wireless communication devices (iPhones #110 and #120), a vehicular transponder device (#130), an entry transponder device for doors (#140), a household device (#150, a thermostat), a pet leash tag (#160), and anchor nodes 170a-170c.

I have been studying patents for over 35 years, Apple patents in particular. The Apple UWB patents have been of great interest to me, as I knew UWB would become not only an indoor mapping system, like GPS for indoor spaces, but also crucial to AR/MR/VR environments for fine-tuning spatial coordinates.
UWB will also be very useful with automobiles, drones, and robotic systems. I wrote a few reports for clients, and one VC commented that "this would forever change the way we view indoor spaces". I agree.

These Apple patents are a potpourri of ways Apple can, and very likely will, use UWB. I could literally write a book on how this will play out for Apple just via these patents. Some of what I have learned since 2010 I will use in this answer.

The Apple UWB Personal Radar System

UWB can also serve as a sort of personal radar that can self-reference the waves it sends out and echolocate your spatial world with a high degree of precision. I can see this as one way to protect folks hunched over, thumb-clawing at the screen while walking down the street: the iPhone puts up a notification of an imminent collision.

Specimen: a whimsical radar screen.

Although humorous, if Apple makes this into an open standard, and there is evidence some aspects may become open-sourced, imagine a world where UWB radios are in all automobiles and carried by all pedestrians. The collision detection and avoidance systems could become very powerful and save many lives.

How Does UWB Work?

UWB IEEE 802.15 WPAN devices collaborate with each other by exchanging pulse shape information that can be used for a future ranging exchange. The receiving UWB devices use the pulse shape information to improve ranging accuracy. The improved ranging accuracy can be used in complex multipath environments where advanced estimation schemes are used to extract an arriving path for time-of-flight estimation.
Time-of-flight is the basis of how UWB works, very much like GPS.

Specimen: waveforms used to calculate time of flight.

The pulse shape information includes a time-zero index specific to a received ranging signal that is used by UWB receivers to establish timestamps for time-of-flight calculations. This includes measuring performance characteristics and selecting different pulse shape information.

The UWB device receives pulse shape information from other devices, where the pulse shape information is used in UWB communications between the two electronic devices; it then receives a ranging signal that uses that pulse shape information, and determines the distance between the devices based at least in part on the pulse shape information and the ranging signal. Determining the distance includes calculating a time-of-flight associated with the ranging signal. The pulse shape information includes a time-zero index, which may be a sample of the main lobe of the pulse shape (e.g., the first sample or the center sample of the main lobe). The pulse shape information also satisfies one or more regional aspects associated with the location of the electronic device.

One or more anchor nodes may be used in conjunction with an iPhone or other device to improve the accuracy and reliability of ranging. The devices may triangulate and determine a geographic location that can be used to provide local direction information. The primary UWB device can also serve as its own anchor node and self-reference the ranging signal pattern, very much like a personal radar system. This may not offer the same high resolution as using two or more devices to triangulate, but it can be quite useful.

Apple U1 Chip

The Apple U1 chip is an application-specific, low-power chip design, very much like the new Decawave impulse radio ultra-wideband (IR-UWB) DW1000 Radio IC chipset.
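Before getting to chip specifics, the time-of-flight and anchor-node ranging described above can be made concrete with a short sketch. This is an illustration only, not Apple's or Decawave's actual algorithm; every timing, coordinate and device value below is invented for demonstration. It shows (a) how two-way ranging turns timestamps into a distance, (b) how distances from three anchors trilaterate to a 2-D position, and (c) why sub-nanosecond timestamps are what make centimetre-level accuracy possible at all.

```python
# Simplified UWB ranging math: illustration only, not a real 802.15.4z
# stack (which also corrects for clock drift, antenna delay, pulse shape).
import math

C = 299_792_458.0  # speed of light, m/s

def two_way_ranging_distance(t_round: float, t_reply: float) -> float:
    """Single-sided two-way ranging: the initiator measures the round-trip
    time t_round, the responder reports its turnaround time t_reply, and
    the one-way time of flight is half the difference."""
    tof = (t_round - t_reply) / 2.0
    return tof * C

def trilaterate(anchors, distances):
    """Solve for (x, y) from three anchors and their measured distances by
    subtracting pairs of circle equations, leaving a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    a, b = 2 * (x2 - x1), 2 * (y2 - y1)
    c = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d, e = 2 * (x3 - x2), 2 * (y3 - y2)
    f = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a * e - b * d  # assumes the anchors are not collinear
    return (c * e - b * f) / det, (a * f - c * d) / det

def tick_resolution_m(timestamp_resolution_s: float) -> float:
    """Distance light travels in one timestamp tick: a hard floor on
    ranging precision."""
    return C * timestamp_resolution_s

# (a) A tag 3 m away: the round trip exceeds the reported turnaround time
#     by twice the one-way time of flight (about 10 ns each way).
tof = 3.0 / C
dist = two_way_ranging_distance(t_round=1e-6 + 2 * tof, t_reply=1e-6)

# (b) Three anchors at known positions range a tag at (4, 3).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist(anchor, (4.0, 3.0)) for anchor in anchors]
x, y = trilaterate(anchors, dists)

# (c) A 1 ns tick corresponds to roughly 30 cm of range, while a tick of
#     the order of ~15.65 ps (typical of impulse-radio UWB timestamping)
#     corresponds to about 5 mm.
coarse = tick_resolution_m(1e-9)
fine = tick_resolution_m(15.65e-12)
```

With perfect timestamps this recovers the distance and position exactly; real radios quantize the timestamps, which is why part (c) matters: nanosecond-class timing can never do better than tens of centimetres, regardless of the estimation scheme layered on top.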
Decawave has sold millions of these chips thus far, with better than 10 centimeters of guaranteed indoor accuracy. It is very possible the Apple U1 chip uses the Decawave chip, licensed technology or a customized OEM version. In theory, it is possible to achieve better than 3 centimeters of accuracy with this technology.

Specimen: Decawave DW1000 chip.

The Decawave DW1000 Radio IC [3], for example, can move 6.8 Mbps of data with an accuracy 100x better than WiFi or Bluetooth. It can reach 290 meters of range with a very minimal power requirement, and is 50x faster compared to standard GPS latency. Although the Apple U1 chip's specifications are not yet released, I suspect we can expect the same or better. Thus, using the DW1000 as a baseline, we can imagine just how important the Apple U1 will become.

Specimen: Apple U1 chip.

Although Apple only indirectly mentioned the U1 chip in the new iPhone 11 series, I think it is likely to be a part of Apple Watch Series 5. And if not in Apple Watch Series 5, it will be in a future version. I also imagine the U1 in AirPods, AirPod cases, Apple Glasses, MacBook Pros and, of course, in a standalone device similar to the Tile.

Specimen: Apple U1 chip.

The very low battery consumption of the U1 chip may make it possible for a single coin-cell hearing-aid battery to last a year or more in normal settings. It is also possible to charge the battery from the ambient radio frequencies that surround all of us, using the patented technology from the Apple-acquired company Passif [4]. It seems a very natural use case for Passif technology, and it was once used to power early internal testing of iBeacons.

Specimen: UWB topography.

Given the very low power consumption of the Apple U1 chip and Passif ambient radio-wave battery charging, it seems that we may see a less-than-2-inch disc, a device I will call Locate or AppleLocate (internally called Rose Tag by Apple), in the market.
It seems the existence of AppleLocate was revealed in the iPhone Find My app. Thus it seems very likely that one of the first things we see utilizing the U1 chip, outside of the iPhone 11 series, is this new device. Many have speculated this would be a standalone device, and today this seems to be the case. I have asserted since 2012 that this technology would come about, originally as iBeacons and later as UWB.

Specimen: the Locate or AppleLocate image found in Apple software.

The AppleLocate tags will be relatively inexpensive, starting at about $20 and likely drifting to less than $2 in high-volume production, so it seems likely they will become widely used for countless reasons. We will thus see AppleLocate tags in just about all Apple devices at some point, for very precise location tracking and perimeter fencing. For example, you can be notified when an AppleLocate tag, or the U1 chip built into another Apple device, has entered or left an area. With perimeter-fencing breach notifications, it will become orders of magnitude more difficult to steal an Apple product.

HyperLocal And HyperPrivate

It is entirely possible to build a useful AR/MR/VR map of any indoor space using the Apple U1 chip in just a few minutes. This can be combined with the same laser and/or IR technology found in the iPhone for FaceID. Thus, with a combination of the Apple U1 chip and the Apple A13 Bionic neural engine, we will have one of the most sophisticated spatial mapping and analysis systems in any currently available consumer device.

Indoor mapping has been tried via many methods over the years, including the Roomba robotic vacuum cleaner systems. Roomba met with a great deal of pushback from users of their products when it was discovered the company may be selling indoor maps of users' homes [5] built using SLAM technology. In the case of Apple, the U1 chip data, along with the results of the FaceID/TouchID system, is stored in the Secure Enclave.
The data stored in the Secure Enclave is nearly impossible to physically retrieve from the chip and is held only locally on the device, fully encrypted. There will be endless allegations that Apple is trying to collect and sell the results of indoor mapping and other telemetry from the Apple U1 Chip, but this will not be the case. Apple simply does not have access to the information and could not use it without you supplying it.

Developer Opportunity

It seems very likely Apple will open up some of the abilities of the Apple U1 chip to developers. This will of course be by explicit permissions granted to the developer, much like sharing location data. However, I feel strongly Apple will never let a free flow of U1 chip data be shared with any developer app. The opportunities with a well crafted API supplied by Apple can be quite amazing. From AR use cases to collision avoidance systems and just about everything in between, this will become a very interesting new frontier for Apple developers. I think we will begin to see the API by the Apple World Wide Developer Conference 2020.

Apple Glasses: The Held And Worn New Software Version

More than anything else, the Apple U1 Chip helps telegraph the Apple Glasses AR/MR platform. Internally at Apple this is known under a few names; Project StarBoard is linked to the first generations of Apple Glasses. The ARDisplayDevice SDK in iPhoneOS 13 also clearly confirms an external AR/MR display device. The first versions will use the iPhone 11 Series as a tether base, wirelessly. Much like Apple Watch and Apple CarPlay, the Apple Glasses will interact and interplay with full apps on the iPhone. There is likely to be an Apple U1 Chip built in to the Apple Glasses, as well as the new Siri chips and the Bone Conduction sound chip Apple has been working on.
The Apple U1 Chip on the Apple Glasses and on the iPhone will work together as node anchors.

Specimen of a hypothetical Apple Glasses display monitor using AirPlay.

Apple will allow two versions of the same code to run: called held, as on the iPhone, and worn, as on Apple Glasses. And like CarPlay, the worn version for the first generations of Apple Glasses will present less information. Apple has slowly been adding features that will use Apple Glasses in Apple Maps, the Find My app and other Apple apps to help build the infrastructure and early developer interest in Apple Glasses. Thus, as we see use cases slowly rolled out by Apple for the Apple U1 Chip, imagine how each will relate to Apple Glasses. As this slowly plays out, it will become abundantly clear how deeply Apple has been thinking about this.

The Apple U1 chip will be used to help decode hand gestures in an AR/MR/VR spatial environment. The unfortunate aspect is that, just like the QWERTY keyboard, there may be dozens of "standards". It is my sincere hope that American Sign Language becomes the "Silent Voice" for this new user interface. It is robust and nuanced and would include a far wider audience. If we must learn new gestures, let them be gestures that a significant portion of the population already knows. Compared to thumb clawing at glass screens descended from the 1870s QWERTY keyboard, ASL is 10x faster. The Apple U1 Chip will help bring this about in combination with other technologies.

If we gave an #ASLVoice to this next generation adopting American Sign Language to spacial gestures we would unlock future generations from thumb clawing on glass screens into the next century.It is a #VoiceFirst future, #ASLVoice will change the world in astonishing ways.
https://t.co/iQhEWAPgQT pic.twitter.com/re5CVjQAPs— Brian Roemmele (@BrianRoemmele) September 13, 2019

Specimen of open source MediaPipe GPU based software decoding ASL.

I have been experimenting for a few years with glasses in my garage lab that display contextual information and interact via a Voice First system I call The Intelligence Amplifier. I use a number of systems in my cheap glasses, including the Decawave DW1000 Radio IC with a very early version of the IEEE 802.15.4z spec. I have had robust success and see no reason why Apple will not do as well or better with this technology. There is no doubt Apple Glasses will be a Voice First device.

I've been using a UWB IEEE 802.15.4z WPAN chip from Decawave the IR-UWB DW1000 for over a year with robust success.I can now say I have this chip in my cheap glasses below.Many may find interesting, the Apple U1 Chip is IEEE 802.15.4z and based on the DW1000.Coincidence 🧐 https://t.co/VfZRkLjAZL pic.twitter.com/6jWnXwM8qk— Brian Roemmele (@BrianRoemmele) September 12, 2019

Specimen of my cheap glasses modified in my garage lab.

Apple Pay, Retail and Industrial Uses

As I mentioned, there will be countless new use cases for the Apple U1 Chip. I built the first and still the largest Apple Pay map in the world, PayFinders [6]. One of my challenges was to push a notification to the iPhone user's phone when they were inside the business but also close to the checkout. In large stores like Target, I had great accuracy. However, in smaller stores the boundaries were sometimes in and sometimes near the store. I urged Apple to use Bluetooth at top line merchants to help users know where an Apple Pay credit card machine was located and operational. My research showed people just did not want to ask, or even test, in most circumstances. With AppleLocate tags on credit card machines, the Apple Pay user can be directed precisely, to within a few millimeters. The same can be said of product locations.
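The notification problem described above, fire only when the shopper is inside the store and near the checkout, reduces to two distance checks against known positions once precise ranging is available. A hypothetical sketch, assuming 2-D positions in meters (the names and the 3 meter threshold are illustrative, not PayFinders or Apple code):

```python
import math

def dist(p, q):
    """Euclidean distance between two 2-D points, in meters."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def should_notify(user, store_center, store_radius_m, checkout, near_m=3.0):
    """Trigger only when the user is inside the store AND near the checkout.

    Models the store as a circle around store_center; with UWB-grade
    ranging both tests become reliable even in a small storefront.
    """
    inside = dist(user, store_center) <= store_radius_m
    near_checkout = dist(user, checkout) <= near_m
    return inside and near_checkout
```

The point of the sketch is that the hard part was never the logic, it was the position error: with GPS or Bluetooth the "inside the store" circle bleeds onto the sidewalk, while centimeter-level ranging makes both checks trustworthy.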
The "Beacon Triggered Processes" Apple patent, just released on September 5th, 2019, goes a long way toward explaining how the Apple U1 Chip will be used with Apple Pay to begin and complete a sale. Apple Stores are already testing AppleLocate tags in their stores today. Although it takes some interesting radio frequency equipment to find them, I have been successful at two locations. The use case will allow you to find a product like you would on a website, with a whimsical balloon, also used in the Find My app, directing you to the precise location of the Apple product. With FaceID and Apple Pay you just look at your phone, confirm and leave. It is not hard to imagine many retail businesses adopting the system. It is also not hard to imagine AppleLocate used in industrial and medical locations. I will have much to say about this over the next few months, as I have studied these use cases in depth for over a decade.

Finally, it is entirely likely a form of UWB will fully replace NFC for many payment transactions in the physical world. This will not take place any time in the next few years, because merchant payment systems are always on a 5 to 8 year upgrade cycle, but it is likely to take place in the next decade.

Open Sourced Apple UWB Technology

I think it is likely some of the technology around the AppleLocate system and the Apple U1 Chip will be open sourced for adoption by other companies. Much like elements of AirPlay, it would make a great deal of sense for Apple to get as many manufacturers as possible to adopt this system. Apple is light years ahead of Google, Samsung and Amazon in intellectual property and public patents, which would give them a big edge in home automation and help guide Siri to a more favorable position in the home.

Speculative But Very Likely Use Cases Of The Apple U1 Chip

I have written a few reports over the last few years on how UWB technology will be used in the future.
I presented the rather certain use cases in this article, but there will be many others. Given the time and space constraints here, I will mention a few:

- Bitcoin wallets and merchant payment systems
- Medical biometrics
- Voice First HyperLocal HyperContextual systems

Why Did Apple Not Announce The U1 Chip?

So with all of these amazing attributes, why did Apple not announce the Apple U1 Chip? I assert it is a confluence of things:

- The iPhoneOS software needed is not yet released
- Apple will release AppleLocate tags for holiday shopping 2019
- Apple had too many things to announce at this Apple Event and this would take too much time
- Apple is aware of the privacy implications many will raise and wants to spend more time to explain
- Other issues I can not present at this moment in time

Thus it was not in the cards to hear anyone on the stage talk about the Apple U1 Chip, but it certainly was presented behind Phil during his time on the stage and later indirectly communicated as part of a new version of "directional" AirDrop. Some people find it intriguing that AirDrop will use the Apple U1 Chip, but it will make much more sense once we see how AirDrop will work in the AR/MR world that is on the map for Apple. In the meantime, being able to precisely locate people and devices via AirDrop will allow for new ways to send larger amounts of private and encrypted data.

The findable abilities of the U1 chip will bridge across a multitude of Apple systems and software. It begins with AirDrop, the largest file sharing social media network for the 14-23 age cohort. Quite hidden on campuses across the US, AirDrop and Apple Messages serve as an ad-hoc HyperLocal HyperPrivate social network. Built into all new iPhones is this new permission based HyperLocal Social Network, with permission based people finder systems built in.
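Directional selection of a recipient reduces, in its simplest form, to comparing the phone's heading against the bearing to each ranged peer and picking the one inside a narrow angular window. A hypothetical sketch of that geometry (the names and the 15 degree window are my own assumptions, not Apple's implementation):

```python
import math

def bearing_deg(me, peer):
    """Bearing from me to peer in degrees; 0 = +x axis, counterclockwise."""
    return math.degrees(math.atan2(peer[1] - me[1], peer[0] - me[0]))

def pointed_at(me, heading_deg, peer, window_deg=15.0):
    """True when the phone's heading is within window_deg of the peer.

    The modular arithmetic wraps the offset into [-180, 180) so that
    headings near the 0/360 seam compare correctly.
    """
    off = (bearing_deg(me, peer) - heading_deg + 180.0) % 360.0 - 180.0
    return abs(off) <= window_deg
```

Ranging alone gives distance; it is the chip's angle-of-arrival measurement that supplies the heading-relative bearing this comparison needs, which is why UWB, unlike Bluetooth beaconing, can power a point-to-share gesture.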
Before we hear how bad this is, understand this is permission based and by invitation. The new AirDrop will allow for extremely high resolution discovery, to send files and other new elements to friends and devices in a room. The precision is within millimeters. This will be achieved by simply pointing the phone in the direction of the person; a screen animation, in a held or worn (think Apple Glasses) position, will visualize the direction and avatar of the recipient.

Specimen of the new AirDrop location system.

We may begin to see the Apple U1 Chip take shape as early as September 30th, 2019, but more likely over the last quarter of 2019 on to the announcements of Apple Glasses.

The Biggest Apple Announcement Today Was What Apple Actually Didn't Announce—Yet.

Apple began to give developers a small insight into the future of some aspects of the Apple U1 Chip during the World Wide Developer Conference 2019 [7]. The talk "Introducing the Indoor Maps Program" begins to make more sense today. In the talk there were indirect references to how Apple may use this technology.

The accelerometer, GPS and IR proximity sensors of the first iPhone helped define the last generation of products. The Apple U1 Chip will be a material part of defining the next generation of Apple products.

This all will move us to the ambient world of computing, where the mechanical user interface will become less needed and situational images and video will present on the closest Apple U1 Chip enabled display. In some ways most of this is already here. I was urged to post the sort of recursive video below about this Quora answer as an example of the non-mechanical future.

My drive this morning with Ms. https://t.co/qdKFgeC498 #TheIntelligenceAmplifier.I was remiss and did not have her build my last 24 hours better but she is up to the task and found some solutions.I seriously don't know what I would do with out her and https://t.co/bvJPYnwNzN.
pic.twitter.com/MUzo9rl0Cd— Brian Roemmele (@BrianRoemmele) September 12, 2019

Specimen of real-time video interaction with Agatha Best, The Intelligence Amplifier, from my garage lab.

I feel rather strongly that the Apple U1 Chip will, over time, be seen as one of the most important aspects of the September 10th, 2019 Apple Event. We will see it as the start of the HyperLocal world of computing, which ultimately will lead to less of a need for the cloud. The Apple U1 Chip is the start of this process of HyperLocal and HyperContextual computing, where holographic crystal memory [8] and very fast local compute speed will render the cloud as we know it redundant and far less useful. With petabytes of data on every device, all of your data and a useful base of the Internet will be local, in a chip, on the device. This is far more than the speculated IoT edge computing, and the Apple U1 Chip is one important part of bringing this about. We will once again leave the mainframe computer and become cloudless.

[0] Brian Roemmele's answer to Why is Apple's iPhone Touch ID Important?
[1] Ultra-wideband - Wikipedia
[2] Brian Roemmele's answer to What are some interesting applications that are enabled by iBeacons technology featured in iOS 7?
[3] DW1000 Radio IC - Decawave
[4] Brian Roemmele's post in Accepting Payments
[5] Hey, Apple and Google: iRobot wants to sell its Roomba vacuum indoor mapping data
[6] http://PayFinders.com
[7] Introducing the Indoor Maps Program - WWDC 2019 - Videos - Apple Developer
[8] Formation of holographic polymer-dispersed liquid crystal memory by angle-multiplexing recording for optically reconfigurable gate arrays
What are some signals of us living in a simulation?
Due to a recent sharp increase in news coverage of the universe as a simulation and holography, I have added a brief edit that includes this information, with the hope it might be of some help to both new and previous readers. Thanks for your interest and the over 1,000 views so far! (Updated info is shown in bold italics.)

In 2010 I had surgery to remove a benign tumor on my spinal cord. Later that day I was told I was paralyzed from the chest down (T-4) and it was highly likely I would never walk again. While processing the sudden loss of all that had previously defined "who I was", I immersed myself in study to address the classic existential "Big Questions": "why am I here?", "what will happen when I die?". While I am certainly not a PhD theoretical physicist, and therefore unable to fascinate you with new and elegant mathematical equations supporting this hypothesis, I do offer the following as my story.

Today my research continues to leave me with several more existential "Big Picture" type questions. It helps that there is increasing support from mainstream science for the Holographic Universe Theory and other universe-as-simulation theories. In fact Elon Musk (SpaceX and Tesla Motors), the 58th richest man in the world, addressed Recode's Code Conference on June 1, 2016 and, in a Q&A session, pronounced that "the odds that we are not living in a simulated universe are 1 billion to 1!"

What might have begun as science fiction and whispered, non-empirically tested speculation, i.e. that the universe as we know it actually is a computer simulation, has today emerged as a serious line of theoretical and experimental investigation among physicists, astrophysicists and philosophers.

On April 5, 2016 the American Museum of Natural History sponsored the 2016 Isaac Asimov Memorial Debate, hosted by Neil deGrasse Tyson, who led a high powered panel of experts that included Max Tegmark of M.I.T., Sylvester James Gates of the University of Maryland and a past Sr.
Science Advisor to the Obama Administration, David Chalmers of NYU, Lisa Randall of Harvard University and others of note, in an interesting discussion, rather than an actual debate, on the topic "Is The Universe a Simulation?"

Mr. Tyson, during the program and as later quoted in several news outlets, stated the likelihood that we live our lives in a simulated reality is "50-50". The host of the popular TV series "Cosmos" hedged a bit more than Mr. Musk, but I assume this correlates well with their different philosophies of life, scopes of defined goals and career accomplishments.

Additional mainstream media coverage of high profile celebrities and science luminaries who have broached the simulation theory included Hollywood "A" lister Matt Damon. In what was described as a near-reprise of his 1997 performance in "Good Will Hunting", Damon gave an equally passionate performance in his delivery of the Massachusetts Institute of Technology (MIT) Commencement Speech for the Class of 2016, held on June 3, 2016. A recurring theme of his address? "What if the world we live in is a simulation?"

Damon even used the simulation argument (with a possible assist awarded to Hugh Everett and his "Many Worlds Theory") to take a shot at Donald Trump: "If there are multiple parallel universes how did we get stuck in the one with (Trump) in it? Can we transfer universes?"

Another of science's "telegenic personalities", Michio Kaku, startled the science world when he pronounced that he had discovered "proof of God" and that we live in a Matrix-style universe created by an intelligence. This radical change in course was further emphasized by Kaku: "Believe me, everything that we call chance today won't make sense anymore.
To me it is clear that we exist in a plan which is governed by rules that were created, shaped by a universal intelligence and not by chance."

I submit that a simulated universe, and specifically the Holographic Universe Theory, serves as an explanatory platform that seems to provide the best and most satisfying answers among the many alternative theories I studied. The fact that some of the most publicly recognized faces of mainstream science have only recently decided a simulation of reality best addresses the following open issues adds a certain degree of confidence; notwithstanding, I believe the answers seem to flow with or without their stamp of approval. See if you agree.

Is it not odd that Quantum Mechanics reveals a probabilistic component to "reality" at the ground of all existence?

What does the Universe reveal about itself to us, in that it is described so elegantly, so perfectly, by the language of mathematics? There is absolutely no reason whatsoever that "reality" should conform to any laws; rather, reality should be in a constant chaotic state where things just pop into and out of existence. Disorder should be the base case in describing our Universe, not the extreme order routinely revealed to us by mainstream science.

How did our brains evolve to become akin to highly integrated computers, with our eyes telling us that what we see "out there" is the product of cones and rods and the processing of electrical signals that form a "picture" inside our pitch black, warm, wet brains?

What advantageous mutation(s) was/were selected that transformed these incredibly complex machines, from a base comprised mainly of masses of fatty tissue not unlike organs such as the heart, liver and bladder, into something "self-aware", "conscious" and capable of information processing?

Facing many brushes with death after leaving the hospital and rehab, I was keenly interested in our fates post death.
Is there continued existence after death? Do we not all experience a void deep within, an empty spot that many, like myself, have tried to infill with material possessions, love interests, money, food, alcohol, etc., only to learn, regardless of the degree of success in the gathering, that after a short burst of the initial thrill they still leave us empty, longing for something bigger and more satisfying than a quick 70-80 year trip through the materialist reductionist world, all the while looking forward only to an infinite "dirt nap"?

Check out the research of Stuart Hameroff and Roger Penrose and their theory "Orch-OR", or Orchestrated Objective Reduction, which posits that the structural lattice material in our neurons (brain cells), called microtubules, is capable of processing information at incredibly high speeds and becomes entangled with space-time, the fabric of the universe. This makes the existence of the religious concept of a "soul" that survives our bodily existence after death a very real possibility. Often ruthlessly criticized by peers, and contested on Wikipedia, their theory remains standing without any peer reviewed takedown to date.

Say what you will about near death experiences (NDEs), but there are literally millions of credible examples world-wide that share the same (or quite similar) experiences. Post experience, NDErs often make abrupt, positive, less self-centered life-direction changes, which is inconsistent with O2 deprivation or CO2 related "bad trips". The experiencers overwhelmingly state that their recollection of visiting the hereafter was "clearer than clear", focused to a visual acuity "so vivid, beyond any form of eyesight we have on earth".
These accounts obviously do not support the frenetic brain activity of an O2 starved brain, nor an overload of CO2 in a distressed condition. Nearly across the board, NDErs report gaining freedom from any fear they may have had of their now impending, more permanent bodily death. According to one study, the fear of death affects 68% of the total US population.

Struggling with the vastness of space and our relative insignificance is indeed a valid philosophical quandary. However, does it not all seem like less of a huge waste when reconsidered in terms of the sum of all Dark Energy and Dark Matter, with only 5% visible matter, if it is not "real"?

Consistent with any game program, our universe also has built-in max and min values, speeds and spatial limits, and according to the QM "observer effect" the program only animates, i.e. collapses the wave function, when a conscious observer "enters" the room or battlefield, which enormously cuts down on memory and rendering and makes the issue of power required moot. While it is an enormous oversimplification, in general lay terms a simulated reality is not too dissimilar from playing, and living within, a home video game that extends our horizons "light years" away from us.

How can the material universe and all of its apparent solidity be made of atoms that consist of 99.9999999% empty space? Atoms are almost universally shown as renderings in textbooks and other notable sources of science academia and related media, which for many seem seared into our psyche: tightly drawn, NASA logo-like molecules with the electron orbiting the nucleus. The electron(s), if it were possible to draw the atom to scale, would literally be 1-2 miles away, with nothing but empty space in between. In fact, even the nucleus pops in and out of existence, as do the electrons. Solidity anyone?

Consider all of the constants and quantities that are calculated to define our finely tuned universe.
Examining just the cosmological constant (Lambda), which is fine-tuned to roughly 1 part in 10^120 (that is, 120 orders of magnitude, a 1 with 120 zeros behind it): if it varied only slightly, life on earth would be impossible, and the same goes for the very existence of the universe, as it would have instantly collapsed upon itself. There would be no stars producing the elements for carbon based life as we know it; the laws of chemistry, biology, physics and astrophysics would never have made it past the first split second when all time, space and matter came into existence out of a state of pure "nothingness".

When considering pre-Big Bang scenarios, many theories put forth by names like Hawking, Krauss and others continue to violate the next to impossible concept of true nothingness by inserting quantum fluctuations within the vacuum state and other "things", when nothingness means what it says: pure nothingness. Does it not appear quite similar to a cosmic computer simply rebooting?

And is it reasonable to trust that all of this is best explained purely as the product of a random chance occurrence? Consider the odds of this actually happening. It is like flipping a coin roughly 400 times in a row and each and every flip coming up heads! Keep in mind that this is just one of over 40 similar such "just right" settings. Each parameter appears to be "dialed" to intricate tolerances that somehow were all pre-defined to enact a future "timed release" of life enabling conditions, all within the first tiny fractions of a second of the "Big Bang" event.

Oh, the Multiverse, you say? Derived from "String Theory", IMHO this is science's attempt to rescue the chance hypothesis with a lottery example: someone always wins, and lucky for us we "won the Cosmic Lottery".
The ensemble of an infinite or near infinite number of bubble universes is likely not a falsifiable hypothesis, and therefore runs the risk of being considered pseudo-science.

As mentioned above, the speed of light is set at 186,000 miles per second (one of the limits of the parameters of the simulation's program?), yet the results of QM experiments prove again and again that through quantum entanglement, correlations appear instantly at distances up to opposite ends of the observable universe. How can Einstein's strictly enforced "speed limit" be so counter-intuitively circumvented?

I could go on and on, but if this is of any interest I also suggest searching for the "Holographic Universe" and Leonard Susskind. I found confidence in the fact that this is not fringe, way out there, New Age, Quantum Woo wacko stuff; no, this all comes from the very essence of mainstream science. Alternatively, is this just another in a long line of scientific proofs that we must tack onto the already lengthy chain of "coincidences"? Susskind and many others have demonstrated that all of the mathematics that describe the universe work out perfectly (even better than in 3-D?) in 2 dimensions. Does it not become more reasonable to consider that we may be living inside, and are an integral part of, a Super-Super^50-Hi-Def holographic projection whose constituent parts, analogous to "pixels" or quanta (or perhaps even sub-quantum level?), are micro-bits of information that exist and are stored in 2-D (miraculously) on the event horizons of black holes?

On May 27, 2016, Phys.org reported that German physicists have expanded and improved on the prevailing science/physics and now more accurately measure the entropy of black holes (BH) and the 2-D information that is stored on the event horizon (holography), per the prevailing "semi-classical" work of Jacob Bekenstein and Stephen Hawking.
The Germans, who applied a second quantization formalization to loop quantum gravity (LQG), "now have obtained a far more realistic and robust result," said project lead Daniele Pranzetti. An additional, critically important aspect of this study is that "it proposes a concrete mechanism in support to the holographic hypothesis".

This was an important day in physics, providing additional incremental support to a now fast evolving theory. Should it become part of a more expanded focus for future scientific enquiry, then I believe, as Nikola Tesla did when he said over 100 years ago, "The day science begins to study non-physical phenomena, it will make more progress in one decade than in all the previous centuries of its existence."

I realize that making a living in science would become even more difficult than most claim it already is if this were to become a main objective or direction for future hypotheses and experiments, let alone pondering the wild card of what the public's actual reaction to all this might be. However, is the foregoing not feasibly explained as an elaborate computer type simulation that philosophy deduces is best explained as the product of an incalculable intelligence in a programmer's omniscient mind? Search "Inspiring Philosophy" for his succinct, well-reasoned, high quality videos on YouTube.

I hope this expanded answer helps you, or any other reader, to gain an understanding of what it took for me to achieve some sense of peace through the accumulation of significant circumstantial evidence that a simulated reality, and the Holographic Universe Theory, best describes our reality and allows the existential "Big" questions in life to make more sense, given that the evidence strongly suggests a super intelligence is behind it all. Peace to all, and thank you for taking the time to read this.