
Posted (edited)

I present a bible for anyone who has ever worked with microprocessors. Getting to know it is the foundation of any microcomputer specialist's knowledge.


The Bulgarian СМ600 series is a copy of the Motorola 6800.


https://app.box.com/s/svhqtiykd4dn5wlg0lgh


 


MdXCQHE.jpg


Edited by Parni_Valjak
Posted
 

A continuation of the series of teaching aids by Forrest Mims III.

The example in the image is a nod to DIY enthusiasts - one method of obtaining variable clipping, applicable in music effects.

 

TsXlh7D.png

Posted
Introducing USB Type-C -- USB for 21st Century Systems
 
Max Maxfield, Designline Editor
1/29/2015 
 
Industry leaders are poised to start rolling out devices enabled with a new form of USB -- USB Type-C -- featuring a small, robust connector that can handle 20Gbps and 100W.

 

Have you heard about USB Type-C? If not, then you'd better prepare yourself, because this little beauty is poised to take the world by storm. The USB Implementers Forum (USB-IF) demonstrated USB Type-C at CES 2015, and industry leaders are expected to start launching USB Type-C-enabled products by mid-2015. This new USB incarnation offers so many advantages that I predict the rate of its deployment will make all of our heads spin like tops.

 

The world before USB
It seems strange to me that a lot of younger folks don't actually remember a time before the USB (Universal Serial Bus) standard appeared on the scene, so this portion of the proceedings may prove to be of interest to them. And, in the case of us "old timers," it may be fun to take a brief stroll down memory lane and remind ourselves just how hard life used to be.

Let's start by considering a typical desktop or deskside computer system circa the early-1990s. Each of the peripheral devices that plugged into this machine would do so using one of a cornucopia of capriciously convoluted connectors.

 

 

5b9PCsq.jpg

 

 

On the I/O (input/output) panel on the back of the computer there would be two PS/2 connectors -- one for the keyboard and one for the mouse. Even though these connectors were physically the same, however, the devices weren't interchangeable because they used different sets of commands. The only clues you had in the early days were the little mouse and keyboard icons next to the ports on the I/O panel. Unfortunately, these could be quite difficult to locate and decipher when on one's hands and knees in the gloom under one's desk. I can remember the first time I saw a computer that had color-coded plugs and sockets associated with these connectors and thinking to myself: "Wow, that's an amazingly clever idea!" (As you can tell, I was easily impressed in those days.)

Next, you would have one or two 9-pin RS-232 connectors. These were referred to as serial ports or COM ports (for communications), and they could be used to connect a variety of external devices such as scanners, plotters, external modems, and so forth. On the one hand these ports were extremely useful; on the other hand they could be a real pain, because when you connected a new device you often had to set up a load of nitty-gritty communications details, such as the number of data bits, the number of stop bits, and the speed of the interface.

But wait, there's more, because you'd almost certainly have a parallel port and/or a Centronics port to drive your printer, plus you might have a SCSI ("Scuzzy") port to connect to external storage devices, and the list goes on. Oh, what fun we had!

Quite apart from the fact that these connectors were bulky and expensive, they -- and the systems that employed them -- were somewhat limited in their capabilities. For example, there was no such thing as a hot-plug capability in which devices could be added or removed without powering down the computer. These things were cold-pluggable, which meant that if you wanted to add or remove a device, you would have to power-down your computer, make your change, and then power everything up again.

Another big problem came when you ran out of connectors on the back of your machine. If you required an additional RS-232 port, for example, you would have to add an expansion card to your system.

 

 

YBJSC6p.jpg

 

 

Actually you typically needed to add expansion cards for all sorts of things, such as modems and sound cards and suchlike. The problem was that adding an expansion card to an ISA (Industry Standard Architecture) bus-based system was a non-trivial matter. (ISA was the precursor to the Peripheral Component Interconnect (PCI) standard.) In order to add a card, you had to remove the cover to your PC, set a number of switches and/or jumpers on the card to configure it, and then insert the card into a free slot in the system.

What? Did you think I forgot to mention replacing the cover on the PC? If only things were that simple. Generally speaking, this was the time when your problems really started. Once the system was powered up, you would typically have to load the software driver for this device from a floppy disk. Then you would have to juggle a limited number of interrupt request lines to ensure that the resources you had selected weren't already being used by another device. Adding a simple modem card, for example, could take hours -- and that was if you really knew what you were doing; for average users the whole thing was a nightmare of confusion and despair!


USB 1.0 to 3.1
In order to address all of the connector-related issues presented above, a group of seven companies got together in 1994. These companies -- whom we should all thank profusely -- were Compaq, DEC, IBM, Intel, Microsoft, NEC, and Nortel. What they wanted to do was to make it fundamentally easier to connect external devices to computers. In order to do this, they set about addressing the usability issues of existing interfaces and simplifying the software configuration of any devices connected to the computer, as well as permitting greater bandwidths for external devices.

The result, of course, was the Universal Serial Bus, or USB for short. USB 1.0 was released in January 1996, but there were a number of "glitches" and "gotchas," with the result that few USB 1.0 devices actually made it to the market. USB 1.1 was released in September 1998. This release fixed the problems in the 1.0 version and was the earliest version to be widely adopted.

The first time I personally saw a USB connector was on the back of a tower computer I purchased in the summer of 1998. Unfortunately, these connectors didn't work for some time until a software patch became available. On the bright side, this really didn't affect me too much because -- at that time -- I didn't have any USB-enabled devices to plug into the computer anyway. Of course, this was soon to change, and it wasn't long before USB products of all shapes and sizes were to be found strewn all around my office.

Now, if there's one thing we know for sure, it's that we have an ever-increasing demand to move more and more data around. USB 1.x specified data rates of 1.5 Mbps (megabits per second; known as Low Speed) and 12 Mbps (known as Full Speed). These bandwidths were OK for less-demanding applications -- like mice and keyboards and even printers and scanners -- but they were painfully slow when it came to transferring larger chunks of data.

This led to the USB 2.0 specification, which was formally standardized by the USB-IF toward the end of 2001. In addition to adding new transfer mechanisms and other techno-weenie details, USB 2.0 also augmented the existing speed variants with a higher data transfer rate of 480 Mbps (High Speed), which was a 40-fold increase over the Full Speed maximum bandwidth offered by USB 1.x.

The advantages associated with USB are difficult to overstate. I don't know about you, but I can no longer envisage a world without things like USB memory sticks (I find it hard to believe that an RS-232-based memory stick would have the same appeal). The fact that USB-enabled devices are hot-pluggable has saved the human race countless hours that would otherwise have been spent fruitlessly powering computers down and up again. Also, when you connect a new USB device to your computer, it can either use an existing driver or automatically download the required driver over the Internet.

If you run out of USB ports on your computer, you can simply plug in a USB hub (port expander) and off you go again. And there's also the fact that USB cables can be used to power and/or charge peripheral devices like digital cameras and MP3 players.

Now, before we proceed any further, this is probably a good time to consider USB cables. In the case of USB 1.x, we started out with two types of connectors called Type-A (which plugs into the computer) and Type-B (which plugs into the peripheral). As a point of interest, observe that the power contacts are longer than the data contacts, thereby ensuring that a USB-enabled device starts to receive power before the data pins make contact.

 

V7GfWvd.jpg

Some devices, like a mouse or keyboard, are equipped with a cable coming out of them, where this cable boasts a Type-A connector. Other devices, like a printer, require a cable with a Type-A connector on one end and a Type-B connector on the other.

Now, if I might make so bold, I fear that the creation of Type-A and Type-B connectors is where the creators of USB started to get a tad over-enthusiastic and went a little awry. First of all, these connectors are polarized, which means you have to work out which side is the "top" before plugging them in. I cannot tell you how many times I've tried to plug one in the wrong way over the years, which has resulted in much gnashing of teeth and rending of garb, let me tell you.

And why do we need two different connectors in the first place? Why not stick a Type-A plug on both ends of the cable and have Type-A sockets on both the computer and the printer (or other peripheral device)?

Personally, I think this was where the USB designers over-engineered things. I believe they worried that the rest of us were so stupid that we might plug both ends of the cable into two sockets on the same PC -- or even two sockets on different PCs. In reality, the chances of this were pretty slight, and -- even if some drongo did end up doing this -- it would not have been beyond our capabilities to design the systems in question such that they could detect the fact that there was a problem, disable the ports, and inform us as to our mistake.

Oh well, "what's done is done," as they say. It soon became obvious that the Type-A and Type-B connectors were too large for certain applications, and that more compact versions were required for use with smaller devices such as personal digital assistants (PDAs), mobile phones, digital cameras, and so forth.

These newer versions include the Mini-A and Mini-B and Micro-A and Micro-B connectors as illustrated below.

6I9LuHu.jpg

 

Observe that both the Mini and Micro connectors have five pins (unlike the Standard A and B connectors, which have four). This extra ID pin permits distinction of an A plug from a B plug. In the case of an A plug, the ID pin is connected to signal ground; in the case of a B plug, the ID pin is left unconnected. This capability is used to support things like USB On-The-Go (OTG), in which a device can assume the role of a Host (acting as the link master) or a Peripheral (acting as the link slave).
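
As a toy illustration of that role-selection rule, here's a minimal Python sketch of my own (the function name is mine; this is not code from the OTG specification). The decision at attach time boils down to a single test:

def otg_role_at_attach(id_pin_grounded: bool) -> str:
    """Decide the initial USB On-The-Go role from the ID pin.

    An A plug ties the ID pin to signal ground, so the device it is
    plugged into comes up as the Host (link master); a B plug leaves
    the ID pin floating, so that device comes up as the Peripheral
    (link slave).
    """
    return "Host" if id_pin_grounded else "Peripheral"

print(otg_role_at_attach(True))   # A plug inserted -> Host
print(otg_role_at_attach(False))  # B plug inserted -> Peripheral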

While we're here, we should also mention the concept of Wireless USB (unofficially known as WUSB), which is a short-range, high-bandwidth wireless radio communication protocol created by the Wireless USB Promoter Group at the USB Implementers Forum. Wireless USB may be of interest in applications like game controllers, printers, scanners, digital cameras, MP3 players, and hard disks and flash drives.

But we digress...

November 2008 saw the release of the USB 3.0 standard, which can support a SuperSpeed mode offering 5 Gbps. The really cool thing about this is that a USB 3.0 port -- which is usually colored blue -- is backwards-compatible with USB 2.0 devices and cables. The way this works is via a dual-bus architecture that allows USB 1.x/2.0 (Low Speed, Full Speed, or High Speed) and USB 3.0 (SuperSpeed) operations to take place simultaneously. Take a look at the USB 3.0 Standard-A connector shown below (USB 3.0 Standard-B and Micro-B connectors are also available):

 

 

TYDMLs9.jpg

 

 

Observe the front row of four pins providing USB 1.x/2.0 backwards-compatibility; these are augmented by a second row of five pins that support USB 3.0 connectivity.

In July 2013, the USB 3.1 specification was released. By means of a new encoding scheme -- coupled with enhanced, fully-backwards-compatible versions of the same cables and connectors as USB 3.0 -- USB 3.1 doubles the maximum bandwidth to 10 Gbps. This new transfer mode is officially referred to as "SuperSpeed USB 10 Gbps" or "SuperSpeed+" for short.

All of which means that we've now set the scene for USB Type-C to make its grand entrance (cue roll of drums and fanfare of trumpets)...

 

USB Type-C
There are several different aspects to USB Type-C, including the physical connector itself, a much more sophisticated power delivery scheme, and support for flexible new communication modes.

Let's start with the 24-pin connector, which is both small (3mm high and 8mm wide) and robust (it's rated for 10,000 mate/de-mate cycles).

 

hqOPF2x.jpg

 

 

A few of the key features associated with USB Type-C cables and connectors are as follows:

  • The connector is non-polarized -- it plugs in either way -- no longer do we have to dork around trying to determine "Which side goes up?" 
     
  • The connector is small enough that the same connector can be used everywhere -- on workstations, tablet computers, MP3 players, smartphones, digital cameras, etc. 
     
  • Unlike the vast majority of other USB cables, Type-C cables have the same male connector on both ends -- it's up to the things they are plugged into to "negotiate" with each other to determine who is in charge of doing what. 
     
  • The specification supports data bandwidths up to 20 Gbps and facilitates alternate, non-USB, vendor-defined modes (you'll need the right type of cable to support these higher bandwidths and advanced modes as discussed below). 
     
  • The specification supports power delivery of up to 100W for faster charging (you'll need the right type of cable to support the more advanced power delivery modes and higher power levels as discussed below). 
     
  • In the case of the simpler power delivery and data transmission modes, passive (unintelligent) cables may be used. When it comes to the more advanced modes, intelligent cables will be required, where such cables contain an electronic ID that can inform the other elements in the system as to that cable's power capacity and the data bandwidths it can handle.

 

Now let's consider the pinout and primary signal assignments for the USB Type-C connector in a little more detail as illustrated below:

 

 

a6fmNqz.gif

 

The roles of the Vbus (power) and GND (ground) pins are reasonably self-explanatory, or so it would at first appear. In reality, things are a little more complex here than you might expect, but we'll return to this in a moment. First, we need to define a few terms, including Downstream-Facing Port (DFP), which we used to refer to as the host, and Upstream-Facing Port (UFP), which we used to think of as the peripheral device. There's also the concept of a Dual Role Port (DRP), which may be configured to act as a DFP or a UFP, and which may be dynamically switched back and forth between the two.

The D+ and D- pins are used to support legacy USB 2.0 devices. All that is required in this case is for the DFP to supply 5V on the Vbus pins and "Bob's your uncle" (or aunt, as the case may be).

Returning to the Vbus (power) pins, it used to be that the DFP always supplied power to the UFP; now, everything is up for grabs. Consider a USB Type-C-enabled tablet connected to a USB Type-C-enabled television, for example. In this case, the tablet may end up transmitting video data to the television, while the television ends up supplying power to the tablet. Alternatively, if a USB Type-C-enabled tablet were connected to a USB Type-C-enabled smartphone, the tablet may end up supplying the power while the smartphone returns video data for display.

Initially, the DFP will supply 5V to the Vbus pins (i.e., the same as USB 2.0 and USB 3.x). USB 2.0 can supply at most 500mA per port, while USB 3.x boosted this up to 900mA per port. USB Type-C can support up to 100W per port (20V at 5A). The actual voltage and power that can be supplied by the DFP (or UFP), conducted by the cable, and accepted by the UFP (or DFP) is determined by the DFP and UFP, which negotiate to agree on the power delivery (PD) scheme; i.e., who is going to supply the power, who is going to receive the power, what voltage is to be used, and how much current will be made available.

The end result is that -- depending on its capabilities -- sometimes a device may transmit both power and data; sometimes it may receive both power and data; sometimes it may transmit one and receive the other; and sometimes it may dynamically switch between all of the different possibilities. Once again, this is all subject to negotiation between the various devices.

Next, let's consider the two sets of high-speed TX/RX signals. Each of these is capable of supporting the USB 3.1 SuperSpeed+ standard of 10 Gbps, which means USB Type-C can support up to 20 Gbps of raw data. Furthermore, these signals can be configured to support alternative modes and convey various flavors of non-USB data, such as video.

We've also got the SBU1/2 sideband signals, which can be configured to transmit whatever data the DFP and UFP can agree upon; perhaps an audio stream, for example. The bottom line is that six of the USB Type-C signals (four TX/RX pairs and two sideband signals) -- highlighted in yellow in the illustration below -- can be configured to support alternative data modes as required (and as negotiated between the DFP and UFP).

 

 

zqxH7nK.gif

 

 

Several times in our earlier discussions we've mentioned the concept of the DFP and UFP negotiating with each other. Let's consider this concept in a little more detail and -- in order to do so -- let's start with a high-level view of a DFP (host) connected to a UFP (peripheral device) via a USB Type-C cable as illustrated below.

 

 

su7Q03v.gif

 

Observe that there's only one CC conductor in the cable itself; the Vcon pins at either end are connected to ground via Ra (pull-down) resistors in the cable. The Rp (pull-up) resistors in the DFP have different values to the Rd (pull-down) resistors in the UFP, and both types are different to the pull-down resistors in the cable. These differences allow the DFP and UFP to determine who is who and which way round the cable has been plugged into each of them.

Simple USB Type-C cables will be passive, which means they look like the one illustrated above. In order to support the more advanced power and data communication modes, some cables will also be equipped with an electronic ID that can inform the system as to that cable's power capacity and the data bandwidths it can support.

Here's a highly-simplified view of how all of this works. First of all, you plug the USB Type-C cable into both devices (or only one device in the case of something like a keyboard, which has the cable attached). We now enter the Cable Detect (CD) phase in which -- by means of the various pull-up and pull-down resistors -- the DFP and UFP determine which way round the cable is plugged into them and who is doing what to whom. Once the DFP has determined which pins on the cable are its CC and Vcon pins, it applies a 5V signal to the CC pin. This is used to power the electronic ID in the cable (if it has one). In turn, the electronic ID in the cable will transmit data on the Vcon pin describing things like its power capacity and data bandwidths.
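
To make the Cable Detect step more concrete, here's a toy Python sketch of my own (the function names are mine, and the resistor thresholds are illustrative rather than the exact values from the Type-C specification):

# A toy model of Type-C Cable Detect. The DFP pulls each of its two CC
# pins up through Rp; what it measures on a pin depends on what is
# pulled down on the far side: a UFP's Rd on the CC wire, or the
# cable's Ra on the Vcon pin.

RD_NOMINAL = 5_100   # ohms -- UFP pull-down on its CC pin
RA_NOMINAL = 1_000   # ohms -- cable pull-down on its Vcon pin

def classify_cc_pin(measured_ohms: float) -> str:
    """Classify what the DFP sees on one of its two CC pins."""
    if measured_ohms > 100_000:
        return "open"   # nothing pulled down on this pin
    if measured_ohms > 3_000:
        return "Rd"     # a UFP's pull-down: this pin is CC
    return "Ra"         # the cable's pull-down: this pin is Vcon

def cable_detect(cc1_ohms: float, cc2_ohms: float) -> str:
    """Work out attach state and cable orientation from both CC pins."""
    pins = (classify_cc_pin(cc1_ohms), classify_cc_pin(cc2_ohms))
    if pins == ("Rd", "Ra"):
        return "UFP attached; CC on pin 1, Vcon on pin 2"
    if pins == ("Ra", "Rd"):
        return "UFP attached (cable flipped); CC on pin 2, Vcon on pin 1"
    if "Rd" in pins:
        return "UFP attached via a passive cable (no Vcon load)"
    return "nothing attached"

print(cable_detect(RD_NOMINAL, RA_NOMINAL))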

The next step focuses on Power Delivery (PD), in which the DFP and UFP talk to each other to decide who will be supplying power and who will be receiving it. Based on this -- and on the capabilities of the device supplying the power and the ability of the cable to convey it -- the DFP and UFP will come to an agreement regarding the voltage to be used and the current to be made available, and they will then proceed accordingly.

The DFP and UFP will also negotiate with regard to the types of data to be transmitted over the high-speed signal channels. In addition to regular USB protocols, the standard also incorporates the concept of Vendor-Defined Messages (VDM), which can support non-USB applications such as PCIe, VGA, HDMI, DP, etc. Furthermore, the sideband channels can also be vendor-defined (e.g., audio).

So, to summarize USB Type-C in a nutshell, we have a single small, rugged, non-polarized connector -- along with associated cables and sub-systems -- that can support up to 100W for faster charging and up to 20 Gbps for faster data transmission. The whole shebang is extremely flexible and can support additional vendor-defined modes like audio and video. And, just in case you were wondering, we can expect a host of little dongles to appear that will allow USB Type-C devices and cables to interface with their USB 2.0/3.x counterparts (Type-C devices will "talk down" to whatever capabilities are supported by the legacy devices).

FPGAs to the rescue!
I'm sure you'll agree with me that the whole USB Type-C concept sounds extremely exciting. However, there are a few issues to consider, not the least of which is that standards like USB Type-C tend to mutate and evolve over time.

Furthermore, at the present time, existing PHY devices, microcontrollers (MCUs), and application processors (APs) don't include the hardware necessary to support the critical functions required to unlock the power of USB Type-C interfaces, including things like Cable Detect (CD), Power Delivery (PD), SuperSpeed+ Switch (SS) control, and Vendor-Defined Messaging (VDM).

All of this cries out for an FPGA-based solution. One FPGA vendor that is really leaping into this arena with gusto and abandon is Lattice Semiconductor. The folks at Lattice have been putting a lot of thought and effort into implementing USB Type-C, and have generated all sorts of useful intellectual property (IP) blocks for use with their FPGAs.

Let's consider one of the simplest possible scenarios. As soon as USB Type-C-enabled devices like tablets, smartphones, MP3 players, and digital cameras start to appear on the scene, one of the first products we can expect to see are appropriate chargers.

 

 

LZB3Igo.jpg

 

These chargers will need to be able to take advantage of the new Cable Detect (CD) and Power Delivery (PD) functionality in order to negotiate a power contract that best utilizes the charger's capabilities and meets the device's requirements. Once such a contract has been negotiated, the PD function will be used to control a Power Management Integrated Circuit (PMIC), which will provision the negotiated current and voltage.
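
As a rough sketch of what negotiating such a power contract might boil down to, consider the following toy Python model of my own. The voltage/current profiles and function names are illustrative only; the real protocol exchanges structured Power Delivery messages rather than simple tuples.

# The charger (source) advertises the (volts, amps) profiles it can
# supply; the device (sink) states the maximum it can accept; they
# settle on the highest-power profile that fits within both limits.

SOURCE_CAPABILITIES = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 5.0)]

def negotiate_contract(sink_max_volts: float, sink_max_amps: float):
    """Pick the highest-power source profile the sink can accept."""
    candidates = [
        (v, a) for (v, a) in SOURCE_CAPABILITIES
        if v <= sink_max_volts and a <= sink_max_amps
    ]
    if not candidates:
        return (5.0, 0.5)  # fall back to basic 5V operation
    return max(candidates, key=lambda profile: profile[0] * profile[1])

volts, amps = negotiate_contract(20.0, 5.0)
print(f"Contract: {volts} V at {amps} A = {volts * amps:.0f} W")
# The PD logic would then program the PMIC to provision this contract.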

 

 

DMgP3NI.jpg

 

Remember that this is one of the simpler scenarios. Since chargers and power supplies do not need access to high-speed data streams, there is no need to include the switch control logic for these signal paths in this design. The folks at Lattice have a whole slew of application examples that encompass much more sophisticated designs, including switch control logic for high-speed data streams and Dual Role Port (DRP) support.

To be honest, most of what I know about USB Type-C I learned from Gordon Hands -- Director of Marketing (New Initiatives) at Lattice Semiconductor. Don’t let the marketing title fool you; in addition to a Master's degree in Business Administration, Gordon also holds a Bachelor's degree in Engineering (he's one of us, not one of them).

In fact, Gordon was kind enough to say that he would be happy to answer any questions we may have with regard to using Lattice's FPGAs to implement USB Type-C functionality in our products, and that we can contact him at Gordon.Hands@latticesemi.com (he may live to regret this).

So, having read all of the above, what do you think about USB Type-C? Personally, I cannot wait. I'm sure that the initial transition period will have its interesting moments, but my mind is focused on how wonderful things are going to be in the not-so-distant future. Please share your thoughts in the comments below.

 

  Max Maxfield, Editor

 

Posted (edited)

The information about the tubes that were the brainchild of Dr. Siegmund Loewe, which ditronix sent me, comes in the form of an HTML page and is not suitable for presentation in the forum's format. Still, I will give the address of the publication so that anyone interested can look through it.

http://www.jogis-roe...ehren/Loewe.htm

 

Here I am posting an article that examines the phenomenon in question and can shed a sufficiently informative light on it.

 

Tubes within tubes

 

0c0QJHW.png

Edited by Parni_Valjak
Posted (edited)

Pierre-Simon Laplace and the Laplace transform


Br7HIfN.jpg



In a manner characteristic of his time, Pierre-Simon Laplace (1749-1827) published a comprehensive review of the scientific work done by his predecessors, overlaid with his own new mathematical interpretation. Titled Celestial Mechanics, the five-volume work replaced the old-style geometric methodology with Laplace's unique style of calculus, opening up a broad range of new subject matter -- everything from the stability of planetary orbits to a study of probability applicable to the European gaming table.


For centuries, the language of mathematics had been geometry, the study of shapes. Shortly before the time of Laplace, Gottfried Leibniz and Isaac Newton, more or less simultaneously and independently, had introduced into scientific discourse a new perspective: calculus, the language of change. As geometry is to space, calculus is to time.


Differential calculus takes as its subject matter rates of change and the slopes of curves, while integral calculus is concerned with the accumulation of quantities and the areas under or between curves. In either discipline, an infinite sequence or infinite series is seen to converge toward a definite limit.
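
For example (a standard textbook illustration, not drawn from the original article), all three ideas can be captured in one line of modern notation:

\[
\frac{d}{dt}\,t^{2} = 2t, \qquad \int_{0}^{1} 2t\,dt = 1, \qquad \sum_{n=0}^{\infty} r^{n} = \frac{1}{1-r} \ \text{ for } |r| < 1.
\]

The first expression is a rate of change, the second an accumulated area, and the third an infinite series converging to a definite limit.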


Laplace, in this context, formulated the equation named for him, the Laplace transform, and the Laplacian differential operator. The Laplace equation is an elliptic partial differential equation. Its solutions are the harmonic functions, which play a central role in electrodynamics and describe gravitational and fluid potentials as well.
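
In symbols, the Laplace equation for a potential function \(\varphi\) is

\[
\nabla^{2}\varphi = \frac{\partial^{2}\varphi}{\partial x^{2}} + \frac{\partial^{2}\varphi}{\partial y^{2}} + \frac{\partial^{2}\varphi}{\partial z^{2}} = 0,
\]

and its solutions are precisely the harmonic functions mentioned above.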


IH8HY7F.jpg

The formal definition of the Laplace transform of a function f(t).
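
For readers without the image, that definition is

\[
F(s) = \mathcal{L}\{f(t)\} = \int_{0}^{\infty} f(t)\,e^{-st}\,dt,
\]

where \(s\) is a complex frequency variable.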



The Laplace transform is similar to the Fourier transform, discussed in a previous article. But rather than expressing a function as a superposition of sine waves, it expresses a function as a superposition of moments. Both the Fourier and Laplace transforms permit the translation of the time domain to the frequency domain. In an oscilloscope, rather than by means of difficult computation, this transformation can be achieved by pressing a button.


The Laplace transform is a powerful tool that, in changing from the time domain to the frequency domain, shifts the terms of the inquiry from differential to algebraic equations.
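
A one-line worked example (standard textbook material, not from the original article) shows this shift. Applying the rule \(\mathcal{L}\{f'(t)\} = sF(s) - f(0)\) to the differential equation \(f'(t) + a\,f(t) = 0\) gives

\[
sF(s) - f(0) + aF(s) = 0 \quad\Longrightarrow\quad F(s) = \frac{f(0)}{s+a} \quad\Longrightarrow\quad f(t) = f(0)\,e^{-at},
\]

so a problem in calculus is reduced to simple algebra in \(s\).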


Laplace had a wide-ranging intellect. He investigated diverse topics other than celestial mechanics, electrostatics and electromagnetism. He became interested at various times in probability theory, politics and theology. He was among early thinkers who propounded the idea of a black hole. With no observational evidence, he envisioned a body in space that would have such immense density and gravity that no light could escape, so that it became invisible to the outside world.


Edited by Parni_Valjak
Posted (edited)

Open Access

AES E-Library

Software Techniques for Good Practice in Audio and Music Research

 

ABSTRACT

 

In this paper we discuss how software development can be improved in the audio and music research community by implementing tighter and more effective development feedback loops. We suggest first that researchers in an academic environment can benefit from the straightforward application of peer code review, even for ad-hoc research software; and second, that researchers should adopt automated software unit testing from the start of research projects. We discuss and illustrate how to adopt both code reviews and unit testing in a research environment. Finally, we observe that the use of a software version control system provides support for the foundations of both code reviews and automated unit tests. We therefore also propose that researchers should use version control with all their projects from the earliest stage.

Authors: Figueira, Luis; Cannam, Chris; Plumbley, Mark
Affiliation: Queen Mary University of London, London, UK
AES Convention: 134 (May 2013)   Paper Number: 8872
Publication Date: May 4, 2013
Subject: Perception and Education

 

 
Edited by Parni_Valjak
Posted (edited)

Open Access

AES E-Library

PARAMETRIC EQUALIZATION

by
George Massenburg
ITI Audio Products/ITIRI Studios
Cockeysville, Maryland
 
PRESENTED AT THE
42nd CONVENTION
MAY 2-5, 1972

 

ABSTRACT

 
This presentation concerns the application of new equalization techniques to professional audio control. The device utilized is a parametric equalizer which: 1) offers vernier control of frequency and amplitude, and coherent control of "Q" or shape; 2) is suitable for automatic voltage control; and 3) improves transient and phase response by the use of all-active RC circuitry, which also eliminates parasitics.
 
Edited by Parni_Valjak
This topic is locked and no further comments can be posted in it.