Dissertações em Geofísica (Mestrado) - CPGF/IG
Permanent URI for this collection: https://repositorio.ufpa.br/handle/2011/4993
The Academic Master's degree belongs to the Graduate Program in Geophysics (CPGF) of the Institute of Geosciences (IG) of the Universidade Federal do Pará (UFPA).
Browsing Dissertações em Geofísica (Mestrado) - CPGF/IG by advisor "LEITE, Lourenildo Williame Barbosa"
Now showing 1 - 19 of 19
Item Acesso aberto (Open Access) Análise de velocidade por otimização do semblance na reflexão sísmica (Universidade Federal do Pará, 2010) VIEIRA, Wildney Wallacy da Silva; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
The general aim of this work was to develop a systematic methodology for the inversion of seismic reflection data organized in common-midpoint (CMP) gathers, starting from a 1D vertical variation of velocity and thickness, which allows one to obtain the interval velocity, v_int,n, in time, the corresponding interval thickness, z_n, and the corresponding root-mean-square velocity, v_RMS,n, in individualized CMP gathers. A direct consequence of this work is the transformation of these values from time to depth. Two methods were developed to attack the problem, defined as velocity analysis based on the estimation of interval velocity. The first method was based on manual picking of reflection events on CMP gathers, and inversion by curve fitting in the least-squares sense. The second method was based on the optimization of the semblance function to obtain an automatic picking. The methodology combined two types of optimization: a global method (Price or Simplex) and a local method (second-order gradient or conjugate gradient), subject to a priori information and constraints. The picking of events in the time-distance section is of fundamental importance in the inversion process, and the picked points are the input data along with a priori information on the model to be adjusted. The picking must, in principle, avoid events that represent multiples, diffractions and intersections; in a section over 50 picks can be made, while in a semblance map usually not more than 10 events can be picked by eye.
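The semblance coherence measure that drives the automatic picking above can be sketched as follows. This is a minimal, single-sample illustration (a practical implementation sums over a short time window around the moveout trajectory); the function name and the geometry in the test are illustrative, not taken from the thesis.

```python
import numpy as np

def semblance(cmp_gather, offsets, dt, t0, v):
    """Semblance coherence of one (t0, v) trial on a CMP gather.

    cmp_gather: (n_samples, n_traces) array; offsets in m; dt in s.
    Hyperbolic moveout: t(x) = sqrt(t0**2 + (x / v)**2).
    """
    n_samples, n_traces = cmp_gather.shape
    t = np.sqrt(t0**2 + (offsets / v) ** 2)       # moveout time per trace
    idx = np.round(t / dt).astype(int)
    keep = idx < n_samples                        # drop samples past the record end
    amps = cmp_gather[idx[keep], np.nonzero(keep)[0]]
    den = amps.size * np.sum(amps**2)
    return float(np.sum(amps) ** 2 / den) if den > 0 else 0.0
```

Scanning (t0, v) pairs and keeping the maxima of this function is the task that the global-plus-local optimization (Price/Simplex followed by a gradient method) automates.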
The application of this work is focused on seismic data of marine sedimentary basins, to obtain a distribution of velocities for the subsurface, where a plane-horizontal model is applied to individual CMP sections, so that the solution can be used as an initial model in subsequent processes. The real data used in this study were collected by Petrobras in 1985; the selected seismic line was number L5519 of the Camamu Basin, and the CMP presented is number 237. The line consists of 1098 shot points with a right-lateral arrangement. The sampling interval is 4 ms. The spacing between the geophones is 13.34 m, with the first geophone located 300 m from the source. The spacing between the sources is 26.68 m. As a general conclusion, the method for estimating interval velocity developed in this work stands as an alternative support to velocity analysis, where control is necessary over the sequential inversion of CMP gathers along the seismic line, such that the solution can be used as an initial model for imaging and further tomographic inversion. As future work, we propose studies directly and specifically related to seismic velocity analysis, extending the 2D semblance optimization method to 3D, and extending the present studies to the method based on the image ray, aiming at producing a continuous velocity map for the entire section in an automatic way.

Item Acesso aberto (Open Access) Aplicação de deconvolução homomórfica a dados sísmicos (Universidade Federal do Pará, 1998) GOMES, Maria de Valdivia Costa Norat; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
A seismic record is often represented as the convolution of a wavelet with the impulse response relative to the reflection path. The process of separating these two components of the convolution is termed deconvolution. There are a number of approaches for carrying out a deconvolution.
One of the most common is the use of linear inverse filtering, that is, processing the composite signal through a linear filter whose frequency response is the reciprocal of the Fourier transform of one of the signal components. Obviously, in order to use inverse filtering, such components must be known or estimated. In this work, we deal with the application to seismic signals of a nonlinear deconvolution technique, proposed by Oppenheim (1965), which uses the theory of a class of nonlinear systems that satisfy a generalized principle of superposition and are termed homomorphic systems. Such systems are particularly useful in separating signals that have been combined through the convolution operation. The homomorphic deconvolution algorithm transforms the convolutional process into an additive superposition of its components, with the result that the individual parts can be separated more easily. This class of filtering techniques represents a generalization of linear filtering problems. The method offers the considerable advantage that no prior assumption about the nature of the seismic wavelet or the impulse response of the reflection path needs to be made; that is, it does not require the usual assumptions of a minimum-phase wavelet and a random distribution of impulses, although the quality of the results obtained by the homomorphic analysis is very sensitive to the signal/noise ratio, as demonstrated.

Item Acesso aberto (Open Access) Aplicação do método de Kalman a dados geofísicos (Universidade Federal do Pará, 1998-03-03) ROCHA, Marcus Pinto da Costa da; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
The Kalman filter is applied to the inverse filtering, or deconvolution, problem. In this dissertation we applied the Kalman method, viewed as time-domain processing, to separate signal from noise in a sonic log, which is treated as a nonstationary stochastic process. Future work will address the deconvolution problem.
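The central idea of the homomorphic deconvolution abstract above, that the logarithm turns the convolution of wavelet and impulse response into an addition, can be illustrated with the real cepstrum. A minimal sketch, assuming noise-free signals and spectra without zeros; the function names are illustrative:

```python
import numpy as np

def real_cepstrum(x, n_fft):
    """Real cepstrum: inverse FFT of the log amplitude spectrum.
    In this domain a convolution x = w * r becomes an addition c_x = c_w + c_r."""
    spec = np.fft.fft(x, n_fft)
    return np.fft.ifft(np.log(np.abs(spec) + 1e-12)).real

def lifter_low(c, cutoff):
    """Short-pass 'lifter': keep low-quefrency cepstral coefficients, which
    carry the smooth (wavelet) part of the log spectrum."""
    out = np.zeros_like(c)
    out[:cutoff] = c[:cutoff]
    if cutoff > 1:
        out[-(cutoff - 1):] = c[-(cutoff - 1):]   # mirror-symmetric tail
    return out
```

Gating the cepstrum at low quefrencies isolates the wavelet component, and the complement isolates the impulse response, which is the separation the homomorphic algorithm exploits.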
The derivation given of the Kalman filter emphasizes the relationship between the Kalman and the Wiener filters. This derivation is based on the modeling of random processes as the output of linear systems excited by white noise. Illustrative results indicate the applicability of these techniques to a variety of geophysical data processing problems, for example the ideal well log treated here. The Kalman filter offers exploration geophysicists additional insight into processing problem modeling and solution.

Item Acesso aberto (Open Access) Avaliação do efeito de janela e descoloração nos filtros Wiener-Hopf (Universidade Federal do Pará, 1999-05) ALVES, Fábio José da Costa; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
The present master's dissertation consists of studies on seismic deconvolution, where we seek to optimize the operations of smoothing, of resolution for the estimation of the distribution of reflection coefficients, and of recovery of the source pulse. The studied filters are single-channel, the formulations consider the seismic trace as the result of a stationary stochastic process, and we demonstrate the effects of taper windows and of prewhitening on resolution. The applied principle is the minimization of the variance of the difference between real and desired outputs, resulting in a system of Wiener-Hopf normal equations whose solution is the vector of filter coefficients to be applied in a convolution. Spike deconvolution is designed considering the distribution of reflection coefficients as a white series. The operator compresses the seismic events to impulses, and its inverse is a good approximation to the source pulse. The application of taper windows and of prewhitening improves the output of this filter. Spike-series filters are designed using the distribution of reflection coefficients. The statistical properties of the reflection-coefficient distribution affect the operator and its performance.
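The Wiener-Hopf normal equations with prewhitening described above can be sketched as follows. This is a minimal illustration with a dense Toeplitz solve (a production code would use the Levinson recursion); names and test values are illustrative:

```python
import numpy as np

def spiking_operator(trace, n_coef, prewhiten=0.01):
    """Least-squares inverse (spiking) filter via the Wiener-Hopf normal
    equations R a = g, where R is the Toeplitz autocorrelation matrix of the
    trace and g = (1, 0, ..., 0) aims the output at a spike at zero lag.
    `prewhiten` boosts the zero-lag autocorrelation for numerical stability."""
    n = len(trace)
    full = np.correlate(trace, trace, mode="full")
    r = full[n - 1 : n - 1 + n_coef].astype(float)   # lags 0 .. n_coef-1
    r[0] *= 1.0 + prewhiten                          # prewhitening
    R = r[np.abs(np.subtract.outer(np.arange(n_coef), np.arange(n_coef)))]
    g = np.zeros(n_coef)
    g[0] = 1.0
    return np.linalg.solve(R, g)
```

Convolving the returned operator with the trace compresses each wavelet arrival toward an impulse, which is the spike-deconvolution behavior the abstract evaluates under different windows and prewhitening levels.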
Taper windows applied to the autocorrelation degrade the output, and improvement is achieved when they are applied to the deconvolution operator. The Hilbert transform produces good results in the recovery of the source pulse, under the minimum-phase premise. The inverse of the recovered source pulse compresses the seismic events to impulses. When the seismic trace contains additive noise, the results obtained with the Hilbert transform are better than those with spike deconvolution. The smoothing filter suppresses noise in the seismic trace as a function of a prewhitening parameter. The use of smoothed traces improves the spike deconvolution. Double prewhitening generates better results than simple prewhitening. The matched-filter operator is obtained from the maximization of a signal/noise ratio function. Deconvolving the output of the matched filter for the estimation of the distribution of reflection coefficients yields better resolution than using a smoothing filter.

Item Acesso aberto (Open Access) Comparação dos filtros de velocidade e do operador WHLP-CRS na atenuação de múltiplas (Universidade Federal do Pará, 2004-04-20) CRUZ, Edson Costa; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
The geological motivation of this work is the imaging of sedimentary basin structures of the Amazon region, where the generation and accumulation of hydrocarbons is related to the existence of diabase sills. The seismic motivation is the fact that these intrusive rocks present a great impedance contrast with respect to the host rock, which gives rise to external and internal multiples with primary-like amplitudes. The seismic signal of the multiples can predominate over the primary reflection signals from deeper interfaces, making the processing, interpretation and imaging of seismic sections difficult. In this work we study the attenuation of multiples in common-shot (CS) sections by comparing two methods.
The first is the combination of the Wiener-Hopf-Levinson (WHLP) and common reflection surface (CRS) stacking techniques, here called WHLP-CRS, where the operator is designed exclusively in the space-time domain. The second method is a velocity filter (ω-k), applied after the CRS stacking, where the operator is designed exclusively in the frequency-wavenumber domain. The identification of the multiples is performed on the zero-offset (ZO) section simulated by the CRS stacking, using the periodicity between primaries and their multiples. The wavefront attributes obtained through CRS stacking are inserted into movable space-time windows used to calculate the WHLP-CRS operator. The ω-k filter calculations are performed in the frequency-wavenumber domain, where the events of interest are selected for rejection or passage. The ω-k filter is classified as a cut-off filter, with amplitude alteration and phase preservation, whose limits are imposed by the space-time sampling. In practical terms, we conclude that, in the case of multiples, events separated in the x-t domain are not necessarily separated in the ω-k domain, which raises difficulties in designing an ω-k operator with performance similar to that of the x-t operator.

Item Acesso aberto (Open Access) Delineamento do pé do talude na margem continental do Ceará através da integração de dados geológicos e geofísicos (Universidade Federal do Pará, 1992-12-30) CAMPOS, Luiz Gonzaga; EL-ROBRINI, Maâmar; http://lattes.cnpq.br/5707365981163429; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
The methodology of integrated interpretation applied to the geological and geophysical data observed along a profile at the continental margin of Ceará enables the identification and integration of characteristics particular to each kind of datum.
In this manner, it is possible to define the most probable location of important structural features, such as the boundary line between the continental and oceanic crust and the foot of the continental slope, which is the subject of this study. According to Article 76 (paragraph 4, item b) of the United Nations Convention on the Law of the Sea, the foot of the continental slope is defined as the point of maximum change of gradient of the slope at its base. However, this definition, despite its simplicity in a physiographic context, is not sufficient to locate the foot of the slope according to the Convention, and for this reason geophysical methods are used. Within the geophysical-geological context, a quantitative interpretation of the free-air anomalies can define the geophysical model, which represents the subsurface, and can aid the integrated interpretation of the above-mentioned data. An automatic procedure of curve adjustment, combining the inversion techniques of systematic search and gradients, was used to generate the geophysical model. The rigorous prior application of constraints, and their constant re-evaluation by means of an interactive process between seismics and gravimetrics during the quantitative interpretation of the free-air anomalies, constrained the final geophysical model to be within the geological framework of the area and within Airy's theory of isostatic equilibrium. The purpose of this research is to study the geological and geophysical characteristics observed on a profile at the continental margin of Ceará (LEPLAC III), especially at the foot of the continental slope, and to try to establish a methodology of integrated interpretation of these data, whose objective is to define, in a systematic way, the most probable location of this physiographic feature. The methodology used turned out to be very efficient for locating the foot of the continental slope.
In this sense, it was possible to integrate: (i) its physiographic location (distance from the coast and water depth); (ii) a zone of tectonic instability inferred from the faulting, very common in the continental slope; (iii) the end of a disturbed magnetic anomaly zone, which possibly delimits the beginning of the magnetic quiet zone, named anomaly E; and (iv) a point of inflexion in the free-air anomaly curve, associated with the gravimetric effect of the density contrast between the continental crust, the sediment and the sea water. It was also possible to define the most probable location of the boundary line between the continental and oceanic crusts. Because of the rigorous application of the inversion techniques and constraints used, it is possible that the correlation of the intrinsic characteristics of each type of datum, performed at the conclusion of this research, has some foundation and can be confirmed if the methodology described is applied to a greater number of profiles.

Item Acesso aberto (Open Access) Detecção de refletores sísmicos por rede neural discreta (Universidade Federal do Pará, 1999) FERREIRA, Alexandre Beltrão; ANDRADE, André José Neves; http://lattes.cnpq.br/8388930487104926; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
Artificial neural networks have proven to be a powerful technique for solving a wide variety of optimization problems. In this work, we develop a new recurrent network, with no self-feedback loops and no hidden neurons, for seismic signal processing, where the neural network estimates the true polarity, location and magnitude of the reflectors. The main characteristic of this neural network is the use of a type of activation function that permits three possible neuron states, to estimate the position of the seismic reflectors in such a way as to reproduce their true polarities.
The basic idea of this new type of neural network, denominated here the discrete neural network (DNN), is to relate a cost function that describes the geophysical problem with the Liapunov function that describes the neural network dynamics. In this way, the dynamics of the network leads to a local minimization of the Liapunov function, and consequently to a minimization of the cost function. Thus, with a convenient codification of the output signal of the neural network, a solution of the geophysical problem is obtained. The operational evaluation of this neural network architecture is performed with synthetic data obtained through the simple convolutional model and seismic ray theory, and its behavior is examined with additive noise in the data, for minimum-, maximum- and mixed-phase source time pulses.

Item Acesso aberto (Open Access) Dispersão das ondas de Rayleigh na Plataforma Sulamericana (Universidade Federal do Pará, 1989-05-10) SANTA ROSA, Antonio Nuno de Castro; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
This work is a dispersion study of the vertical component of the Rayleigh wave with trajectories on the South American platform. The records used are from stations located in Brazilian territory: Rio de Janeiro (RDJ), Caicó (CAI), Brasília (BDF) and Belém (BEB). These are the only seismological stations in Brazil with long-period sensors useful for dispersion studies in the interval from 4 to 50 seconds. The earthquakes used are located along the eastern part of the Andean mountain belt and inside the South American platform, with typically continental trajectories. 34 events were selected using the following practical criteria: location, magnitude mb, depth, and occurrence in the period from January 1978 to June 1987. The dispersion study carried out here means the determination of group velocities and spectral amplitudes corresponding to the fundamental and first higher modes.
Normally, modes of second order or higher are rarely observed. Two types of measurements were made: (i) group velocity versus period and (ii) spectral amplitude versus period. Dispersion studies are important for determining the structure of the crust and upper mantle, which is directly related to geological phenomena. In this work, regionalization is defined as the identification of different forms of dispersion curves and their relation to the epicenter-station trajectory on the South American platform. We also correlate them geologically, as described in item 4.3 of this work. The distribution of epicenters extends from the extreme south of Argentina to the extreme north of Venezuela, with the objective of initiating with this work a systematic study of the sub-Andean area in our institution. Three distinct types of curves were observed in 27 trajectories and grouped into three families (1, 2 and 3). Their forms were compared with the regional geological features of the South American platform. The multiple filter technique was used to obtain the dispersion curves (Dziewonski et al., 1969). This filter has the property of separating the modes through their group velocities for each selected frequency, and also recovers the spectral amplitudes characteristic of each harmonic (Herrmann, 1973). The theoretical development of the filter properties, as well as its limitations and applicability, is treated by Dziewonski et al. (1972).
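As a rough illustration of the multiple filter technique cited above (Dziewonski et al., 1969): apply a Gaussian narrow-band filter around each target period and read the group arrival off the envelope peak. A minimal sketch, assuming a single dispersed wavetrain; the filter width `alpha` and all test values are illustrative:

```python
import numpy as np
from scipy.signal import hilbert

def group_velocities(signal, dt, distance, periods, alpha=50.0):
    """Multiple filter technique, minimal sketch: for each target period T,
    apply a Gaussian band-pass centered at f0 = 1/T, take the analytic-signal
    envelope, and convert the envelope-peak time into U = distance / t_peak."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, dt)
    spec = np.fft.rfft(signal)
    result = []
    for T in periods:
        f0 = 1.0 / T
        gauss = np.exp(-alpha * ((freqs - f0) / f0) ** 2)   # narrow-band window
        narrow = np.fft.irfft(spec * gauss, n)
        envelope = np.abs(hilbert(narrow))                   # group-arrival envelope
        t_peak = np.argmax(envelope) * dt
        result.append(distance / t_peak if t_peak > 0 else np.nan)
    return np.array(result)
```

Repeating this over a grid of periods yields the group-velocity-versus-period curve; tracking the largest and secondary envelope peaks is what allows the fundamental and first higher modes to be separated.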
Part of this work was the implementation, adaptation and flowchart layout of the multiple filter software, as well as the digitization procedure for data processing and the non-automatic interpretation of the results.

Item Acesso aberto (Open Access) Espalhamento geométrico em modelos plano-estratificados (Universidade Federal do Pará, 2004) PESSOA, Márcio Marcelo da Silva; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
The measurement of physical parameters of reservoirs is of great importance for the detection of hydrocarbons. To obtain these parameters, an amplitude analysis is performed with the determination of the reflection coefficients. For this, it is necessary to apply special processing techniques able to correct the spherical divergence effects on seismic time sections. A problem can be established through the following question: which effect is relatively more important for the amplitude attenuation, geometrical spreading or transmission loss? A justification for this question resides in the fact that the theoretical dynamic correction applied to real data addresses exclusively the geometrical spreading. On the other hand, a physical analysis of the problem from different directions places the answer in doubt, which is interesting and in contradiction with practice. A more physically based answer to this question can give better grounds to other works in progress. The present work aims at the calculation of the spherical divergence according to the Newman-Gutenberg theory, and at correcting synthetic seismograms calculated by the reflectivity method. The test model considered is crustal, in order to have critical refraction events besides reflection events, and to better position the time window for the application of the spherical divergence correction, which results in obtaining the so-called "true amplitudes".
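For the plane-layered, zero-offset case, the spreading correction discussed above is commonly approximated by the gain g(t) = t · v_RMS²(t) / v₁ following Newman (1973). A minimal sketch with illustrative values, not the crustal model of the thesis:

```python
import numpy as np

def vrms_squared(t, v_int, t_bottom):
    """Mean-square velocity at two-way time t for plane horizontal layers.
    v_int: interval velocities; t_bottom: two-way times at layer bottoms."""
    acc, t_top = 0.0, 0.0
    for v, tb in zip(v_int, t_bottom):
        seg = min(t, tb) - t_top
        if seg <= 0:
            break
        acc += v * v * seg
        t_top = tb
    if t > t_bottom[-1]:                       # continue with half-space velocity
        acc += v_int[-1] ** 2 * (t - t_bottom[-1])
    return acc / t

def divergence_gain(t, v_int, t_bottom):
    """Zero-offset spreading gain g(t) = t * v_rms(t)**2 / v1 (after Newman, 1973)."""
    return t * vrms_squared(t, v_int, t_bottom) / v_int[0]
```

Multiplying each sample of a trace by this gain undoes the first-order amplitude decay; comparing the corrected amplitudes against the reflectivity-method seismogram is the kind of check the thesis performs.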
The simulated medium is formed by plane-horizontal, homogeneous and isotropic layers. The reflectivity method is a form of solution of the elastic wave equation for this reference model, which makes an understanding of the structured problem possible. To arrive at the results obtained, synthetic seismograms were calculated using the Fortran program P-SV-SH written and supplied by Sandmeier (1998), and reflection geometrical spreading curves as a function of time were calculated as described by Newman (1973). As a conclusion, we have shown that from the model information (velocities, thicknesses, densities and depths) it is not simple to obtain an equation for the geometrical spreading correction aiming at true amplitudes. The major aim would then be to obtain a panel of the spherical divergence function to correct for true amplitudes.

Item Acesso aberto (Open Access) Inversão de dados de sísmica de refração profunda a partir da curva tempo-distância (Universidade Federal do Pará, 1990) CRUZ, João Carlos Ribeiro; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
The aim of this thesis is to obtain a crustal model through the inversion of deep seismic refraction data, considering laterally homogeneous, plane horizontal layers over a half-space. The forward model is given by an analytic expression for the travel-time curve, as a function of the source-station distance and of the array of parameters formed by the velocity and thickness of each layer. The expression is obtained from the trajectory of the seismic ray by Snell's law. The calculation of the refraction arrival time by this method assumes a model with velocities increasing with depth. The occurrence of low-velocity layers (LVL) is handled by a model reparametrization, taking into account the fact that the top boundary of a low-velocity layer is only a reflector, not a refractor, of seismic waves.
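The forward model just described, a direct wave plus head-wave branches whose intercept times follow from Snell's law, can be sketched as follows. This assumes velocities increasing with depth; the function name and the test geometry are illustrative:

```python
import numpy as np

def refraction_traveltime(x, velocities, thicknesses):
    """First-arrival time at offset x for plane layers over a half-space.

    Direct wave: t = x / v1.  Head wave refracted at the top of layer n:
    t = x / vn + sum_{i<n} 2 h_i sqrt(1/v_i**2 - 1/v_n**2)   (Snell's law).
    Takes the minimum over all branches, which equals the first arrival."""
    v = np.asarray(velocities, float)
    h = np.asarray(thicknesses, float)        # thicknesses above the half-space
    times = [x / v[0]]                        # direct wave in the first layer
    for n in range(1, len(v)):
        if np.any(v[:n] >= v[n]):
            continue                          # no head wave without a velocity increase
        delay = 2.0 * np.sum(h[:n] * np.sqrt(1.0 / v[:n] ** 2 - 1.0 / v[n] ** 2))
        times.append(x / v[n] + delay)
    return min(times)
```

Fitting curves of this form to picked first arrivals, with velocities and thicknesses as the parameter array, is exactly the inversion the thesis performs with the COMPLEX direct-search algorithm.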
The inversion method is used to find the possible solutions, and also to analyze the ambiguity of the problem. The search region of probable solutions is constrained by upper and lower limits on each parameter considered, and by upper limits on each critical distance, calculated using the array of parameters. The inversion process used is an optimization technique for curve fitting corresponding to a direct search in the parameter space, called COMPLEX. This technique has the advantage of accepting any objective function, and is practical for obtaining different solutions to the problem. As the travel-time curve is a multi-branch function, the algorithm was adapted to minimize several objective functions simultaneously, with constraints. The inversion process is formulated to obtain a representative group of solutions of the problem. Afterwards, the analysis of ambiguity is made by Q-mode factor analysis, through which it is possible to find the common properties of the group of solutions. Tests with synthetic and real data were made, having as the initial approximation for the inversion process the velocity and thickness values calculated by straightforward visual interpretation of the seismograms. For the synthetic tests, seismograms calculated by the reflectivity method with different models were used. For the test with real data, seismograms collected by the Lithospheric Seismic Profile in Britain (LISPB), in the northern region of Britain, were used.
It was verified in all tests that the geometry of the model is the most important factor for the ambiguity of the problem, while the physical parameters present only smaller changes within the group of solutions obtained.

Item Acesso aberto (Open Access) Inversão de dados sísmicos de reflexão a partir da curva do tempo de trânsito (Universidade Federal do Pará, 2007) PENHA, Lidiane Nazaré Monteiro; GOMES, Ellen de Nazaré Souza; http://lattes.cnpq.br/1134403454849660; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
The present master's thesis had as its objective the study of the seismic inversion problem based on flat reflectors for common-source (CS) and common-midpoint (CMP) gathers. The forward model is described by homogeneous, isotropic, plane horizontal layers. The problem is related to the NMO stack based on the optimization of the semblance function, for CMP sections corrected for moveout time. The study was based on two principles. The first was the combination of two groups of inversion methods: a global and a local method. The second was stripping according to the Wiechert-Herglotz-Bateman theory, which establishes that, to know a lower layer, it is necessary to know the upper layer first. The application of the study aims at the seismic simulation of the terrestrial Solimões and marine sedimentary basins, to obtain a 1D distribution of velocity and layer thickness of the subsurface for target horizons. In this sense, we limited the inversion experiments to 4 to 11 reflectors, since in practice the industry limits the interpretations to about 3 to 4 main reflectors. It stands out that this model is applicable as an initial condition for the imaging of seismic sections in geologically complex regions with slow lateral variation of velocities. The synthetic data were produced based on geological information, which corresponds to strong a priori information for the inversion model.
For the construction of models related to the projects in progress, we analyzed the following relevant subjects: (1) the geology of the terrestrial Solimões and marine sedimentary basins (stratigraphy, structure, tectonics and petroleum systems); (2) the physics of vertical and horizontal seismic resolution; and (3) the temporal-spatial discretization of the multi-coverage cube. The inversion process depends on the discretization of the wave field in time-space, on the physical parameters of the seismic survey, and further on the resampling in the multiple-coverage cube. The forward model used corresponds to the case of the NMO (1D) stack operator, considering a flat observation topography. The basic criterion taken as reference for the inversion and curve fitting is the 2-norm (quadratic). The inversion using this simple model is computationally attractive for being fast, and convenient for allowing several other techniques to be included with a logical physical interpretation; e.g., the projected Fresnel zone (ZFP), the direct calculation of the spherical divergence, Dix inversion, linear inversion by reparametrization, a priori information, and regularization. The ZFP proves to be a useful concept to establish the aperture of the spatial inversion window in the time-distance section. The ZFP represents the influence of the data on the horizontal resolution. The estimate of the ZFP indicates a minimum aperture based on an adopted model. The spherical divergence is a smooth function, and it has a physical basis to be used in the definition of a data weight matrix for tomographic inversion methods. The need for robustness in the inversion can be analyzed in seismic sections (CS and CMP) submitted to filtering (corner frequencies 5, 15, 75 and 85 Hz; trapezoidal band-pass), where one can identify, compare and interpret the information contained.
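Among the auxiliary techniques listed above (spherical divergence, ZFP, Dix inversion), the Dix step admits a compact sketch. A minimal illustration assuming noise-free RMS velocities; the test values are illustrative:

```python
import numpy as np

def dix_interval_velocities(v_rms, t0):
    """Dix inversion: interval velocities from RMS velocities and zero-offset
    two-way times t0:
        v_int_n**2 = (v_n**2 t_n - v_{n-1}**2 t_{n-1}) / (t_n - t_{n-1})."""
    v = np.asarray(v_rms, float)
    t = np.asarray(t0, float)
    num = np.diff(v**2 * t, prepend=0.0)      # layer-by-layer v**2 * dt
    den = np.diff(t, prepend=0.0)             # layer two-way times
    return np.sqrt(num / den)
```

Because the numerator differences amplify noise in v_rms, in practice this step is the one that most benefits from the a priori information and regularization the thesis includes in the inversion.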
From the sections, we conclude that the data are contaminated with isolated points, which suggests methods of the class considered robust, having as reference the 2-norm (least-squares) curve fitting. The development of the algorithms used the FORTRAN 90/95 programming language, the program MATLAB for the presentation of results, and the package CWP/SU for synthetic seismic modeling, picking of events and presentation of results.

Item Acesso aberto (Open Access) Marcação e atenuação de múltiplas de superfície livre, processamento e imageamento em dados sísmicos marinhos (Universidade Federal do Pará, 2012) CARNEIRO, Raimundo Nonato Colares; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
The present research aimed at the analysis and attenuation of free-surface multiples, and at the processing and imaging of marine seismic data, to obtain migrated images useful for geological interpretation focused on oil exploration. Attention was paid to the systematic study of free-surface multiples from the point of view of the prediction filter based on communication theory, in order to better apply the WH predictive deconvolution filter at a processing stage subsequent to the NMO correction, although other methods may be considered competitive. The identification and attenuation of multiple reflections in real seismic data remains a major challenge in seismic data processing, since multiples are considered noise. However, as this noise is classified as coherent, several techniques have been developed aiming at its attenuation, to avoid cascading errors in later stages such as event marking, tomographic inversion, imaging, and finally the geological interpretation of the images obtained. Another aspect of this study was to establish a processing and imaging flowchart with the attenuation of free-surface multiples as a central step. Migrated sections were obtained in time and in depth, on which interpretation is possible.
The development of this research was performed with the CWP/SU and MATLAB packages.

Item Acesso aberto (Open Access) NIP-tomografia usando método CRS e dados sísmicos marinhos (Universidade Federal do Pará, 2013) AFONSO, João Batista Rodrigues; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
This work consisted of the application of techniques for processing, inversion and imaging of the Marmousoft synthetic data, and of the Jequitinhonha real data obtained on the eastern Atlantic continental shelf of the State of Bahia. The conventional NMO and CRS stack methods, and NIP-tomographic inversion, were applied to the mentioned data. The NMO stack served to produce RMS and interval velocity distribution maps in the semblance domain. The CRS stack of both data sets was used for picking reflection events to obtain the wavefield parameters that served to constrain the model as input for the NIP-tomographic inversion. The inversion results in a smooth velocity model. Kirchhoff depth migration was used to verify the obtained velocity models. We critically analyzed the applied techniques, and compared the CRS and NMO stacks. The evolution of the visual quality of the obtained CRS and NMO sections was analyzed, as measured by trace-by-trace event continuity and the signal/noise ratio. The differences and improvements in the velocity model obtained by NIP tomography were also analyzed. Kirchhoff prestack depth migration was applied aiming at geological interpretation, and at pointing out better conditions for processing and imaging.

Item Acesso aberto (Open Access) Processamento de dados sísmicos reais da região amazônica (Universidade Federal do Pará, 2006-05-17) GOMES, Anderson Batista; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617
The treatment of seismic data is divided basically into three parts: preprocessing, processing and imaging.
In the present thesis we discuss the stages of preprocessing and two important processing methods directed at the simulation of zero-offset (ZO) sections from multiple-coverage data. Conventional (NMO/DMO) processing and Common Reflection Surface (CRS) processing were applied to seismic data from some lines of group 204 of the Tacutu Graben (Brazil) data set. We used the CWP/SU system to carry out the preprocessing stages and the conventional (NMO/DMO) processing stage. The CRS processing was carried out with the WIT/CRS system. The preprocessing consisted basically of three parts: organization of the geometry; zeroing and muting of noisy traces; and filtering in temporal frequency (f filter) and velocity (f-k filter). Deconvolution was carried out; however, since the results did not bring any additional information, they were not used further. Besides, the elevation static correction was not applied because the topography is very smooth (elevation variation less than 20 m) in the Tacutu plateau. The quality of the NMO/DMO processing results was strongly biased by the dependence of the method on a velocity model, which in this case was not accurate enough. We also found difficulties with the velocity analysis (VA) due to the great amount of noise present in the data. As a consequence, the normal moveout correction (NMO) and migration did not generate good results. Based on the attributes estimated by the CRS stack method, a smooth macrovelocity model was obtained using reflection tomographic inversion. Using this macro-model, pre- and post-stack depth migration were carried out. The CRS attributes were also used in the residual static correction method, and the results demonstrate a better resolution of the stacked section. The sections resulting from stacking and migration were interpreted aiming at the delineation of structures.
From the visual details of the panels, we interpreted thinning, a main faulted anticline and discontinuity, and plays of horsts, grabens, and rollovers were traced. On the other hand, the basement could not easily be traced.Item Acesso aberto (Open Access) Processamento e imageamento de dados sísmicos marinhos(Universidade Federal do Pará, 2010) LIMA JÚNIOR, Hamilton Monteiro de; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617The work consisted of the processing and imaging of real seismic data obtained over the marine continental shelf of the eastern Brazilian Atlantic, in the Jequitinhonha Basin. The preprocessing stage was performed in partnership with the Seismic Research Group of the Federal University of Bahia, which applied the SRME method for multiple attenuation. Subsequently, these data were submitted to several processing techniques using the free package CWP/SU. The CRS processing stage consisted of the application of three workflows that used the CRS technology for simulating ZO sections. These workflows differed according to the systematic inclusion of the processes named Residual Static Correction and Prestack Data Enhancement. The results of these three workflows were compared with each other to show the evolution of the visual quality of the resulting sections through event continuity, as well as the signal-to-noise ratio.
In addition to the stacked sections, CRS migrated sections were also obtained, intended for plausible geological interpretation aiming at a possible indication of a successful drilling.Item Acesso aberto (Open Access) Processamento e imageamento NMO/CRS de dados sísmicos marinhos(Universidade Federal do Pará, 2010) NUNES, Fernando Sales Martins; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617This work is devoted to the processing and imaging of marine seismic reflection data selected from the Jequitinhonha Basin, Bahia, where two stack methods were used, the NMO-based and the CRS-based. The CRS-partial prestack data enhancement was also applied for the densification of CMPs. Several tests were performed with these methods to optimize the parameters of the stacking operators and to improve the processing strategies. One of the efforts during processing was the attenuation of the surface-related multiples, which was attacked by applying the Radon and SRME techniques. Data densification was also applied to improve the signal/noise ratio. As a conclusion, a strategy was chosen based on the comparative results of better visual quality and higher coherence values, with the CRS method presenting results superior to those of the NMO method, based on the visualization of reflection events that were not noticed in other sections.Item Acesso aberto (Open Access) Processamento e imageamento sísmico usando o CRS(Universidade Federal do Pará, 2014-02-04) PENA, Felipe Astur Valdes; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617This work aimed at the application of the common-reflection-surface stack methods (CRS-conventional), of the CRS-partial method, and of the NIP-tomography inversion method, to generate seismic images for interpretation in geologically complex areas.
The constructed model, named Duveneck-Astur, was used to simulate a geological environment formed by layers bounded by smooth reflector interfaces, so that the assumptions of paraxial ray theory were satisfied, differently from other common synthetic models where geological faults and high horizontal and vertical gradients are present, as in the Marmousi and Sigsbee models, among others. To comparatively analyze the resolution of the applied methods, two tests were performed with the synthetic data. One test consisted of data decimated by random muting of traces in the CMP families, and another of data with added noise. The behavior of the different stack methods was analyzed numerically to obtain a depth velocity distribution by NIP-tomography inversion, which uses the kinematic wavefield attributes as constraints to estimate a velocity model consistent with the data. The NIP-tomography results were compared with each other, and also with the velocity model obtained from semblance velocity analysis. The velocity distributions were used in the PSPI migration to verify the consistency of the results.Item Acesso aberto (Open Access) Processamento, imageamento, interpretação e predição de pressão de dados sísmicos na bacia sedimentar do Jequitinhonha(Universidade Federal do Pará, 2016-08-18) SILVA, Aucilene de Nazaré Pimenta da; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617The present work aimed to compose a seismic-stratigraphic framework for part of the Jequitinhonha Basin (marine eastern part of the State of Bahia), with data released for funded academic projects in progress. The study aims at the exploration of oil and gas, and corresponds to a proposal for sedimentary basin reevaluation.
For this, the methodology is composed of velocity analysis, CRS stacking, and migration, culminating in pressure prediction for the subsurface, where the aim is to map low-pressure (reservoir) and high-pressure (generator) zones that act as natural pumps for fluid accumulation. The seismic data used in this study were provided by PETROBRAS to the Postgraduate Course in Geophysics (CPGF) of the Institute of Geosciences of the Federal University of Pará. The data were acquired by the PETROBRAS seismic team 214 in the offshore part of the basin. The available lines used in this study were L214-266, L214-268, L214-270, and L214-297. They can be separated into two groups: three lines in the NE-SW direction (L214-266, L214-268, L214-270) and one in the NW-SE direction (L214-297). The velocity distributions used for the seismic sections were based on petrophysical information and empirical models, instead of using subjective event picking in common-midpoint, stack, or migration sections. The first part of the methodology is the application of techniques based on the theory of the common reflection surface (CRS) stack, which aims to generate seismic images of good quality for the interpretation of real data related to geologically complex media. The seismic-stratigraphic interpretation was performed on the basis of the geological information, making a correlation between the main reflectors (interfaces of higher impedance) and the stratigraphic units of the area. In this way, we constructed an empirical model for the velocity (𝑣𝑃 and 𝑣𝑆) and density (𝜌) distributions for the sections of the studied block. A broader research project aims to predict stress in sedimentary basins, as a contribution to the methods and techniques of geology and oil and gas exploration engineering.
This subject is based on the knowledge of the compressional (𝑣𝑃) and shear (𝑣𝑆) velocities and of the densities (𝜌), in order to locate low- and high-pressure zones in the subsurface, which serve as natural suction pumps for gas and oil accumulation. The theory is based on the elastodynamic equations, where the gravitational weight of the overburden is responsible for the stress-strain effects in the subsurface. Organizing this problem therefore requires the generalized Hooke's law of linear elasticity. We presented details of the theoretical model, and an example to show how pressure varies in the subsurface, where we highlight that pressure does not necessarily increase linearly, but in a complex way that requires specific numerical computation to reveal important details. The applied theoretical model takes the vertical gravity load of the geological formations as the pressure agent, and does not take into account the effects of curvature, faulting, and diagenesis. Complex lateral tectonic events are also not accounted for. The prediction of pressure and stress is an important issue for the analysis of sedimentary basins, aiming at mapping and extending potentially oil- and gas-productive areas. But an accurate prediction needs a 3D model for a significantly complete practical application.Item Acesso aberto (Open Access) Processamento, inversão e imageamento de dados sísmicos marinhos(Universidade Federal do Pará, 2012) SILVA, Douglas Augusto Barbosa da; LEITE, Lourenildo Williame Barbosa; http://lattes.cnpq.br/8588738536047617This work dealt with the study, processing, inversion, and imaging, in time and in depth, of the Marmousi seismic data, and of the real Jequitinhonha data obtained on the eastern Brazilian Atlantic continental shelf of the State of Bahia. The NMO and CRS stack methods and the NIP-tomographic inversion technique of the kinematic wavefield attributes were applied.
With the NMO stack, a velocity distribution map was obtained through velocity analysis on the semblance coherence map, and afterwards the stacked and migrated sections in the time and depth domains. The common reflection surface (CRS) stack method was applied with the crsstack-511 program to obtain the stacked and migrated sections in the time domain, and to extract the wavefield parameters through coherence analysis and the redundancy present in the multi-coverage seismic data. The NIP-tomographic inversion of the reflection events took place through the application of the niptomo program, which is an implementation of the inversion method of the kinematic attributes of the hypothetical NIP wave, extracted directly from the CRS stack, to obtain a smoothed velocity model and, subsequently, a depth-migrated section. The migrated sections are of the Kirchhoff type. The techniques followed a predetermined workflow executed through a makefile, which works as a stage organizer. These stages were carried out on a Linux desktop with the Seismic Un*x system of the Center for Wave Phenomena (CWP) of the Colorado School of Mines. The results of the three techniques were compared with the aim of illustrating the evolution of the visual quality of the resulting sections, through trace-by-trace event continuity and the signal/noise ratio, to analyze differences and improvements in the migrated sections, aiming at a better geological interpretation, better conditions of processing and imaging, and the support of possibly successful drillings.
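Several of the abstracts in this collection rely on semblance-based velocity analysis along NMO hyperbolas. As a minimal illustrative sketch (not code from any of the dissertations; the function name, array layout, and nearest-sample interpolation are assumptions), the semblance coherence of a CMP gather at a trial pair (t0, v) can be computed like this:

```python
import numpy as np

def semblance(gather, offsets, dt, t0, v):
    """Semblance coherence of a CMP gather along the NMO hyperbola
    t(x) = sqrt(t0^2 + x^2 / v^2).

    gather  : 2D array (n_samples, n_traces), one column per trace
    offsets : source-receiver offset of each trace, in metres
    dt      : time sample interval, in seconds
    t0, v   : trial zero-offset time (s) and stacking velocity (m/s)
    """
    n_samples, _ = gather.shape
    amplitudes = []
    for j, x in enumerate(offsets):
        t = np.sqrt(t0**2 + (x / v) ** 2)  # hyperbolic moveout time
        i = int(round(t / dt))             # nearest-sample interpolation
        if i < n_samples:
            amplitudes.append(gather[i, j])
    a = np.array(amplitudes)
    if a.size == 0 or np.sum(a**2) == 0.0:
        return 0.0
    # Ratio of stacked energy to total energy: 1 for perfect alignment
    return np.sum(a) ** 2 / (a.size * np.sum(a**2))
```

Scanning (t0, v) pairs and picking the maxima of this function yields the stacking-velocity field; the first dissertation above replaces the visual picking of these maxima with numerical optimization of the semblance function, combining global (Price or Simplex) and local (gradient or conjugate) methods.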