Force Wget to use encoding as the default remote server encoding. You can set the default encoding using the remoteencoding command in .wgetrc. Force Wget to unlink a file instead of clobbering an existing file. This option is useful for downloading to a directory that contains hardlinks. Do not create a hierarchy of directories when retrieving recursively. Disable generation of host-prefixed directories; by default, recursive retrieval creates a directory named after the remote host, and this option disables such behavior. Use the protocol name as a directory component of local file names. Ignore number directory components. This is useful for getting fine-grained control over the directory where recursive retrieval will be saved.
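As a sketch of how the component-ignoring option combines with the directory options (the host name, path, and cut depth here are all hypothetical):

```shell
# Hypothetical mirror of ftp://example.com/pub/sub/dir/:
# -nH drops the host-prefixed directory (example.com/), and
# --cut-dirs=2 drops the "pub/sub" components, so files land under dir/.
wget -r -nH --cut-dirs=2 ftp://example.com/pub/sub/dir/
```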
Set directory prefix to prefix. The directory prefix is the directory where all other files and subdirectories will be saved to, i.e. the top of the retrieval tree. As of version 1.12, the option was renamed; the old option name is still acceptable, but should now be considered deprecated. At some point in the future, this option may well be expanded to include suffixes for other types of content, including content types that are not parsed by Wget.
Specify the username user and password password on an HTTP server. According to the type of the challenge, Wget will encode them using either the basic (insecure), the digest, or the Windows NTLM authentication scheme. Either method reveals your password to anyone who bothers to run ps. If the passwords are really important, do not leave them lying in those files either: edit the files and delete them after Wget has started the download.
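One way to keep credentials out of ps output is the startup file. A minimal sketch, assuming the http_user and http_password startup commands and purely illustrative values:

```shell
# ~/.wgetrc -- values are placeholders; protect this file (chmod 600)
http_user = alice
http_password = s3cret
```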
The cookie is then resent by the browser when accessing that part of the site, and so proves your identity. Mirroring such a site requires Wget to send the same cookies your browser sends when communicating with the site. Different browsers keep textual cookie files in different locations. This has been tested with Internet Explorer 5; it is not guaranteed to work with earlier versions. Save cookies to file before exiting. Session cookies are normally not saved because they are meant to be kept in memory and forgotten when you exit the browser.
Saving them is useful on sites that require you to log in or to visit the home page before you can access some pages. With this option, multiple Wget runs are considered a single browser session as far as the site is concerned. Since the cookie file format does not normally carry session cookies, Wget marks them with an expiry timestamp of 0.
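The expiry-timestamp-of-0 convention can be seen directly in a Netscape-format cookie file. A minimal sketch (the cookie names and values are made up): the fifth tab-separated field is the expiry, so session cookies are exactly the lines where that field is 0.

```shell
# Build a two-line Netscape-format cookies.txt:
# domain, subdomain flag, path, secure flag, expiry, name, value
printf 'example.com\tFALSE\t/\tFALSE\t0\tSESSIONID\tabc123\n' > cookies.txt
printf 'example.com\tFALSE\t/\tFALSE\t1999999999\tprefs\tdark\n' >> cookies.txt
# Session cookies are marked with expiry 0 (field 5).
session_cookies=$(awk -F'\t' '$5 == 0 { print $6 }' cookies.txt)
echo "$session_cookies"   # SESSIONID
```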
You can spot this syndrome if Wget retries getting the same document again and again, each time claiming that the otherwise normal connection has closed on the very same byte. With this option, Wget will ignore the Content-Length header, as if it never existed. Send header-line along with the rest of the headers in each HTTP request. The supplied header is sent as-is, which means it must contain the name and value separated by a colon, and must not contain newlines.
Specification of an empty string as the header value will clear all previous user-defined headers. As of Wget 1. In versions of Wget prior to 1. Choose the type of compression to be used. If the server compresses the file and responds with the Content-Encoding header field set appropriately, the file will be decompressed automatically. This is the default. Compression support is currently experimental.
In case it is turned on, please report any bugs to bug-wget@gnu.org. Specifies the maximum number of redirections to follow for a resource. The default is 20, which is usually far more than necessary. However, on those occasions where you want to allow more (or fewer), this is the option to use.
Specify the username user and password password for authentication on a proxy server. Wget will encode them using the basic authentication scheme. Useful for retrieving documents with server-side processing that assume they are always being retrieved by interactive web browsers and only come out properly when Referer is set to one of the pages that point to them. Save the headers sent by the HTTP server to the file, preceding the actual contents, with an empty line as the separator.
This enables distinguishing the WWW software, usually for statistical purposes or for tracing of protocol violations. However, some sites have been known to impose the policy of tailoring the output according to the User-Agent-supplied information. While this is not such a bad idea in theory, it has been abused by servers denying information to clients other than (historically) Netscape or, more frequently, Microsoft Internet Explorer.
This option allows you to change the User-Agent line issued by Wget. Use of this option is discouraged, unless you really know what you are doing. Other than that, they work in exactly the same way. Wget will simply transmit whatever data is provided to it. Any other control characters in the text will also be sent as-is in the POST request. In case a server wants the client to change the request method upon redirection, it should send a 303 See Other response code.
This example shows how to log in to a server using POST and then proceed to download the desired pages, presumably only accessible to authorized users.
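A sketch of such a login-then-fetch sequence (the server name, form fields, and paths are all hypothetical):

```shell
# 1. Log in via POST; save cookies, including session cookies.
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'user=foo&password=bar' \
     http://example.com/auth.php
# 2. Replay the saved cookies to fetch pages behind the login.
wget --load-cookies cookies.txt -p http://example.com/interesting/article.php
```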
If Wget is redirected after the request is completed, Wget will suspend the current method and send a GET request until the redirection is completed. This is true for all redirection response codes except 307 Temporary Redirect, which is used to explicitly specify that the request method should not change.
If this is set to on, experimental (not fully-functional) support for Content-Disposition headers is enabled. This can currently result in extra round-trips to the server for a HEAD request, and is known to suffer from a few bugs, which is why it is not currently enabled by default. This option is useful for some file-downloading CGI programs that use Content-Disposition headers to describe what the name of a downloaded file should be.
If this is set to on, Wget will not skip the content when the server responds with an HTTP status code that indicates an error. If this is set, on a redirect, the local file name will be based on the redirection URL.
By default the local file name is based on the original URL. When doing recursive retrieval this can be helpful because in many web sites redirected URLs correspond to an underlying file structure, while link URLs do not. If this option is given, Wget will send Basic HTTP authentication information (plaintext username and password) for all requests. Use of this option is not recommended, and is intended only to support a few obscure servers, which never send HTTP authentication challenges, but accept unsolicited auth info, say, in addition to form-based authentication.
Consider given HTTP response codes as non-fatal, transient errors. Supply a comma-separated list of 3-digit HTTP response codes as argument. Useful to work around special circumstances where retries are required, but the server responds with an error code normally not retried by Wget. Retries enabled by this option are performed subject to the normal retry timing and retry count limitations of Wget. Using this option is intended to support special use cases only and is generally not recommended, as it can force retries even in cases where the server is actually trying to decrease its load.
Please use it wisely and only if you know what you are doing. The current default is GnuTLS. If Wget is compiled without SSL support, none of these options are available.
Choose the secure protocol to be used. This is useful when talking to old and buggy SSL server implementations that make it hard for the underlying SSL library to choose the correct protocol version. Fortunately, such servers are quite rare. It has a bit more CPU impact on client and server. Only ciphers that are known to be secure are used. Set the cipher list string. Wget will not process or manipulate it in any way.
Although this provides more secure downloads, it does break interoperability with some sites that worked with previous Wget versions, particularly those using self-signed, expired, or otherwise invalid certificates. It is almost always a bad idea not to check the certificates when transmitting confidential or important data. Use the client certificate stored in file. This is needed for servers that are configured to require certificates from the clients that connect to them. Normally a certificate is not required and this switch is optional. Specify the type of the client certificate.
Read the private key from file. This allows you to provide the private key in a file separate from the certificate. Specify the type of the private key. The certificates must be in PEM format. Each file contains one CA certificate, and the file name is based on a hash value derived from the certificate. Specifies a CRL file in file. This is needed for certificates that have been revoked by the CAs. Tells Wget to use the specified public key file or hashes to verify the peer. A public key is extracted from this certificate and if it does not exactly match the public key(s) provided to this option, Wget will abort the connection before sending or receiving any data.
On such systems the SSL library needs an external source of randomness to initialize. EGD stands for Entropy Gathering Daemon, a user-space program that collects data from various unpredictable system sources and makes it available to other programs that might need it. Encryption software, such as the SSL library, needs sources of non-repeating randomness to seed the random number generator used to produce cryptographically strong keys. If this variable is unset, or if the specified file does not produce enough randomness, OpenSSL will read random data from the EGD socket specified using this option.
If this option is not specified (and the equivalent startup command is not used), EGD is never contacted. Wget will use the supplied file as the HSTS database. If Wget cannot parse the provided file, the behaviour is unspecified. Each line contains an HSTS entry, i.e. a site that has issued a Strict-Transport-Security header. Lines starting with a dash are ignored by Wget. Please note that in spite of this convenient human-readability, hand-hacking the HSTS database is generally not a good idea. The hostname and port fields indicate the hostname and port to which the given HSTS policy applies. The port field may be zero, and it will be, in most cases.
That means that the port number will not be taken into account when deciding whether such HSTS policy should be applied on a given request (only the hostname will be evaluated). When the port is different from zero, both the target hostname and the port will be evaluated and the HSTS policy will only be applied if both of them match. Thus, this functionality should not be used in production environments, and the port will typically be zero. The last three fields do what they are expected to.
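As an illustration of the zero-port convention, here is a sketch of one plausible entry layout; the exact field order is an assumption based on the description above (hostname, port, include-subdomains flag, created time, max-age):

```shell
# One assumed HSTS database entry; tab-separated fields.
printf 'example.com\t0\t1\t1600000000\t31536000\n' > hsts.db
# Port 0: the policy applies regardless of the port used in the request.
port=$(awk -F'\t' 'NR == 1 { print $2 }' hsts.db)
echo "$port"   # 0
```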
Once that time has passed, that HSTS policy will no longer be valid and will eventually be removed from the database. When Wget exits, it effectively updates the HSTS database by rewriting the database file with the new entries. If the supplied file does not exist, Wget will create one.
This file will contain the new HSTS entries. If no HSTS entries were generated (no Strict-Transport-Security headers were sent by any of the servers), then no file will be created, not even an empty one. Care is taken not to override possible changes made by other Wget processes at the same time over the HSTS database. For more information about the potential security threats arising from such practice, see section 14 "Security Considerations" of RFC 6797, especially section 14.9 "Creative Manipulation of HSTS Policy Store". Specify the username user and password password on an FTP server. To prevent the passwords from being seen, store them in .wgetrc or .netrc, and make sure to protect those files from other users.
Normally, these files contain the raw directory listings received from FTP servers. Not removing them can be useful for debugging purposes, or when you want to be able to easily check on the contents of remote server directories (e.g. to verify that a mirror you are running is complete). Note that even though Wget writes to a known filename for this file, this is not a security hole in the scenario of a user making .listing a symbolic link to /etc/passwd or the like and asking root to run Wget in that directory.
Depending on the options used, either Wget will refuse to write to .listing, making the globbing/recursion/time-stamping operation fail, or the symbolic link will be deleted and replaced with the actual listing. A user could do something as simple as linking index.html to /etc/passwd and asking root to run Wget so that the file would be overwritten. Turn off FTP globbing. By default, globbing will be turned on if the URL contains a globbing character. This option may be used to turn globbing on or off permanently.
You may have to quote the URL to protect it from being expanded by your shell. Globbing makes Wget look for a directory listing, which is system-specific. Disable the use of the passive FTP transfer mode. Passive FTP mandates that the client connect to the server to establish the data connection rather than the other way around. If the machine is connected to the Internet directly, both passive and active FTP should work equally well. By default, when retrieving FTP directories recursively and a symbolic link is encountered, the symbolic link is traversed and the pointed-to files are retrieved.
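The "globbing character" rule can be sketched in pure shell. This hypothetical helper flags a URL the same way the text describes, by looking for *, ? or [:

```shell
# has_glob: succeed when the argument contains a shell globbing character.
has_glob() {
    case "$1" in
        *[*?[]*) return 0 ;;   # contains *, ? or [
        *)       return 1 ;;
    esac
}
has_glob 'ftp://example.com/pub/*.tar.gz' && echo glob       # glob
has_glob 'ftp://example.com/pub/file.tar.gz' || echo noglob  # noglob
```

Quoting the URL on the wget command line serves the same purpose in reverse: it keeps your own shell from expanding those characters before Wget sees them.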
Currently, Wget does not traverse symbolic links to directories to download them recursively, though this feature may be added in the future. Instead, a matching symbolic link is created on the local filesystem. The pointed-to file will not be retrieved unless this recursive retrieval would have encountered it separately and downloaded it anyway. This option poses a security risk where a malicious FTP Server may cause Wget to write to files outside of the intended directories through a specially crafted.
Note that when retrieving a file (not a directory) because it was specified on the command line, rather than because it was recursed to, this option has no effect. Symbolic links are always traversed in this case. All the data connections will be in plain text. For security reasons, this option is not asserted by default. The default behaviour is to exit with an error. Turn on recursive retrieving.
See Recursive Download , for more details. The default maximum depth is 5. Specify recursion maximum depth level depth see Recursive Download. This option tells Wget to delete every single file it downloads, after having done so.
It is useful for pre-fetching popular pages through a proxy. After the download is complete, convert the links in the document to make them suitable for local viewing. This affects not only the visible hyperlinks, but any part of the document that links to external content, such as embedded images, links to style sheets, hyperlinks to non-HTML content, etc. This kind of transformation works reliably for arbitrary combinations of directories.
Because of this, local browsing works reliably: the fact that the former links are converted to relative links ensures that you can move the downloaded hierarchy to another directory. Note that only at the end of the download can Wget know which links have been downloaded.
This filename part is sometimes referred to as the "basename", although we avoid that term here in order not to cause confusion. It proves useful to populate Internet caches with files downloaded from different hosts. Note that only the filename part has been modified. Turn on options suitable for mirroring. This option turns on recursion and time-stamping, sets infinite recursion depth and keeps FTP directory listings.
This option causes Wget to download all the files that are necessary to properly display a given HTML page. This includes such things as inlined images, sounds, and referenced stylesheets. Ordinarily, when downloading a single HTML page, any requisite documents that may be needed to display it properly are not downloaded. For instance, say document 1.html contains an IMG tag referencing 1.gif and an A tag pointing to external document 2.html. Say that 2.html is similar, but that its image is 2.gif and it links to 3.html. Say this continues up to some arbitrarily high number.
As you can see, 3.html is then missing its requisite 3.gif, because Wget is simply counting the number of hops away from 1.html in order to determine where to stop the recursion. With the page-requisites option, however, each page's requisites are downloaded as well. Links from that page to external documents will not be followed. Turn on strict parsing of HTML comments. Because of this, many popular browsers completely ignore the specification and implement what users have come to expect. Beginning with version 1.9, Wget has joined the ranks of clients that implement "naive" comments, terminating each comment at the first occurrence of the closing delimiter.
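A sketch of the page-requisites behaviour discussed above (the host and path are hypothetical):

```shell
# Download one page plus everything needed to display it, and
# convert links so the result is browsable locally.
wget -p --convert-links http://example.com/dir/page.html
```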
Specify comma-separated lists of file name suffixes or patterns to accept or reject (see Types of Files). Specify the regular expression type. Set domains to be followed. Specify the domains that are not to be followed (see Spanning Hosts). Without this option, Wget will ignore all the FTP links.
If a user wants only a subset of those tags to be considered, however, he or she should specify such tags in a comma-separated list with this option. To skip certain HTML tags when recursively looking for documents to download, specify them in a comma-separated list. In the past, this option was the best bet for downloading a single page and its requisites, using a command line that combined several of these options.
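The historical command line alluded to above may have looked roughly like this (the URL is a placeholder, and the exact option mix is an assumption):

```shell
# Treat <a>/<area> links as ignorable so only page requisites are followed,
# spanning hosts (-H), converting (-k) and backing up (-K) links, recursively.
wget --ignore-tags=a,area -H -k -K -r http://example.com/document.html
```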
Ignore case when matching files and directories. The quotes in the example are to prevent the shell from expanding the pattern. Enable spanning across hosts when doing recursive retrieving see Spanning Hosts. Follow relative links only. Useful for retrieving a specific home page without any distractions, not even those from the same hosts see Relative Links. Specify a comma-separated list of directories you wish to follow when downloading see Directory-Based Limits.
Elements of list may contain wildcards. Specify a comma-separated list of directories you wish to exclude from download see Directory-Based Limits. Do not ever ascend to the parent directory when retrieving recursively. This is a useful option, since it guarantees that only the files below a certain hierarchy will be downloaded. See Directory-Based Limits , for more details. With the exceptions of 0 and 1, the lower-numbered exit codes take precedence over higher-numbered ones, when multiple types of errors are encountered.
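In a script, the documented exit codes can drive error handling. A sketch using a stub in place of a real wget run; the code-to-meaning mapping used here (0 success, 4 network failure, 8 server error response) follows Wget's numbered exit codes:

```shell
# Stub standing in for: wget -q "$url"; it pretends the server
# issued an error response (exit code 8).
fetch() { return 8; }

fetch
case $? in
    0) status=ok ;;
    4) status=network-failure ;;
    8) status=server-error ;;
    *) status=other ;;
esac
echo "$status"   # server-error
```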
In earlier versions, recursive downloads would virtually always return 0 (success), regardless of any issues encountered, and non-recursive fetches only returned the status corresponding to the most recently attempted download. We refer to this as recursive retrieval, or recursion. This means that Wget first downloads the requested document, then the documents linked from that document, then the documents linked by them, and so on. In other words, Wget first downloads the documents at depth 1, then those at depth 2, and so on until the specified maximum depth. The default maximum depth is five layers.
When retrieving an FTP URL recursively, Wget will retrieve all the data from the given directory tree including the subdirectories up to the specified depth on the remote server, creating its mirror image locally. FTP retrieval is also limited by the depth parameter. By default, Wget will create a local directory tree, corresponding to the one found on the remote server. Recursive retrieving can find a number of applications, the most important of which is mirroring. It is also useful for WWW presentations, and any other opportunities where slow network connections should be bypassed by storing the files locally.
You should be warned that recursive downloads can overload the remote servers. Because of that, many administrators frown upon them and may ban access from your site if they detect very fast downloads of big amounts of content. The download will take a while longer, but the server administrator will not be alarmed by your rudeness. Of course, recursive download may cause problems on your machine. If left to run unchecked, it can easily fill up the disk. If downloading from local network, it can also take bandwidth on the system, as well as consume memory and CPU.
Try to specify the criteria that match the kind of download you are trying to achieve. See Following Links, for more information about this. When retrieving recursively, one does not wish to retrieve loads of unnecessary data. Most of the time the users bear in mind exactly what they want to download, and want Wget to follow only specific links. This is a reasonable default; without it, every retrieval would have the potential to turn your Wget into a small version of Google.
However, visiting different hosts, or host spanning, is sometimes a useful option. Maybe the images are served from a different server. Maybe the server has two equivalent names, and the HTML pages refer to both interchangeably. Unless sufficient recursion-limiting criteria are applied (depth among them), these foreign hosts will typically link to yet more hosts, and so on until Wget ends up sucking up much more data than you have intended.
You can specify more than one address by separating them with a comma. When downloading material from the web, you will often want to restrict the retrieval to only certain file types. For example, if you are interested in downloading GIFs, you will not be overjoyed to get loads of PostScript documents, and vice versa.
Wget offers two options to deal with this problem. Each option description lists a short name, a long name, and the equivalent command in .wgetrc. A matching pattern contains shell-like wildcards, e.g. *, ?, [ and ]. Look up the manual of your shell for a description of how pattern matching works. So, if you want to download a whole page except for the cumbersome MPEGs and .AU files, you can use the rejection list.
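A sketch of such a rejection list (the host, path, and suffix choices are illustrative); the quoting keeps the shell from expanding the patterns:

```shell
# Recursive download, skipping MPEG and .au files.
wget -r -R '*.mpg,*.mpeg,*.au' http://example.com/music/
```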
The quotes are to prevent expansion by the shell. This behavior may not be desirable for all users, and may be changed for future versions of Wget. It is expected that a future version of Wget will provide an option to allow matching against query strings. This behavior, too, is considered less-than-desirable, and may change in a future version of Wget.
Regardless of other link-following facilities, it is often useful to place a restriction on what files to retrieve based on the directories those files are placed in. There can be many reasons for this: the home pages may be organized in a reasonable directory structure, or some directories may contain useless information. Wget offers three different options to deal with this requirement. Any other directories will simply be ignored.
The directories are absolute paths. The simplest, and often very useful, way of limiting directories is disallowing retrieval of the links that refer to the hierarchy above the beginning directory, i.e. disallowing ascent to the parent directory or directories. Using it guarantees that you will never leave the existing hierarchy. Only the archive you are interested in will be downloaded.
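For instance, a no-parent retrieval might look like this (the URL is hypothetical); nothing above /people/alice/ will be touched:

```shell
wget -r --no-parent http://example.com/people/alice/
```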
GNU Wget 1.20 Manual
Relative links are here defined as those that do not refer to the web server root. The rules for FTP are somewhat specific, as it is necessary for them to be. FTP links in HTML documents are often included for purposes of reference, and it is often inconvenient to download them by default. Also note that followed links to FTP directories will not be retrieved recursively further. One of the most important aspects of mirroring information from the Internet is updating your archives.
Downloading the whole archive again and again, just to replace a few changed files is expensive, both in terms of wasted bandwidth and money, and the time to do the update. This is why all the mirroring tools offer the option of incremental updating. Such an updating mechanism means that the remote server is scanned in search of new files. Only those new files will be downloaded in the place of the old ones. To implement this, the program needs to be aware of the time of last modification of both local and remote files. We call this information the time-stamp of a file.
With this option, for each file it intends to download, Wget will check whether a local file of the same name exists. If it does, and the remote file is not newer, Wget will not download it. If the local file does not exist, or the sizes of the files do not match, Wget will download the remote file no matter what the time-stamps say.
The usage of time-stamping is simple. Say you would like to download a file so that it keeps its date of modification. A simple ls -l shows that the time stamp on the local file equals the state of the Last-Modified header, as returned by the server. Several days later, you would like Wget to check if the remote file has changed, and download it if it has. Wget will ask the server for the last-modified date. If the local file has the same timestamp as the server, or a newer one, the remote file will not be re-fetched.
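The comparison itself can be sketched locally. Here two files stand in for the local copy and the remote Last-Modified time, and the shell's -nt ("newer than") test plays the role of Wget's decision:

```shell
# Local copy is older than the "remote" one, so it would be re-fetched.
touch -t 202001010000 local.html
touch -t 202206150000 remote.html
if [ remote.html -nt local.html ]; then
    decision=download
else
    decision=skip
fi
echo "$decision"   # download
```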
However, if the remote file is more recent, Wget will proceed to fetch it. After download, a local directory listing will show that the timestamps match those on the remote server. If you wished to mirror the GNU archive every week, you would use a command like the following, run weekly. Note that time-stamping will only work for files for which the server gives a timestamp. If you wish to retrieve the file foo, and the file does exist locally, Wget will first check its local time-stamp (similar to the way ls -l checks it), and then send a HEAD request to the remote server, demanding the information on the remote file.
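A weekly mirroring job of the kind described above might be sketched as a crontab entry (the URL and log path are placeholders):

```shell
# Run every Sunday at midnight: update the mirror, log to a file.
0 0 * * 0 wget --mirror http://example.com/archive/ -o /home/me/weeklog
```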
If the remote file is newer, it will be downloaded; if it is older, Wget will give up. It will try to analyze the listing, treating it like Unix ls -l output, extracting the time-stamps. The rest is exactly the same as for HTTP. The assumption that every directory listing is a Unix-style listing may sound extremely constraining, but in practice it is not, as many non-Unix FTP servers use the Unixoid listing format because most (all?) of the clients understand it. Bear in mind that RFC 959 defines no standard way to get a file list, let alone the time-stamps. We can only hope that a future standard will define this.
Another non-standard solution includes the use of the MDTM command, supported by some FTP servers (including the popular wu-ftpd), which returns the exact time of the specified file.
Wget may support this command in the future. Once you know how to change default settings of Wget through command line arguments, you may wish to make some of those settings permanent. You can do that in a convenient way by creating the Wget startup file, .wgetrc. Failing that, no further attempts will be made. The variable will also be called command.
Valid values are different for different commands. The commands are case-, underscore- and minus-insensitive. Commands that expect a comma-separated list will clear the list on an empty command. So, if you wish to reset the rejection list specified in the global wgetrc, you can do it with an empty value.
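A sketch of such a reset in a user's .wgetrc (assuming the rejection list is controlled by the reject command):

```shell
# ~/.wgetrc -- an empty value clears the list inherited from the global wgetrc
reject =
```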
The complete set of commands is listed below. Some commands take pseudo-arbitrary values. Most of these commands have direct command-line equivalents. If this option is given, Wget will send Basic HTTP authentication information (plaintext username and password) for all requests. Use up to number backups for a file.
Set the certificate authority bundle file to file. Set the directory used for certificate authorities. Set the client certificate file name to file. If this is set to off, the server certificate is not checked against the specified client authorities. If set to on, force continuation of preexistent partially retrieved files. Ignore n remote directory components.
With dot settings you can tailor the dot retrieval to suit your needs, or you can use the predefined styles see Download Options. Specify the number of dots that will be printed in each line throughout the retrieval 50 by default. Use string as the EGD socket file name. Set your FTP password to string. Choose the compression type to be used. Turn the keep-alive feature on or off defaults to on.
Force connecting to IPv4 addresses, off by default. Available only if Wget was compiled with IPv6 support. Force connecting to IPv6 addresses, off by default.